No one who is even vaguely familiar with the ongoing farce that is Brexit (which probably means everyone!) can be unaware of the extraordinary amount of propaganda that is peddled by both sides. Diametrically opposed facts are provided by politicians with completely straight faces, even though at least one of them knows that they are telling a considerable untruth.
Something similar happens in the IT world, especially when it comes to emerging technologies. The proponents of the new idea are quick to tell anyone who cares to listen that everyone is embracing the new technology and they’d be foolish to get left behind. Meanwhile, the companies that produce the ‘traditional’ technology under threat from the new one tell the opposite tale – that no one is using the new-fangled, unproven, highly suspect, unstable product/software. The truth might lie somewhere in the middle.
Nothing astounding about this observation, one might think. And no, there isn’t, but I just thought it was worth reminding one and all that anything presented as a fact needs to be carefully checked before it can be acknowledged as such.
There’s no doubting that, right now, the technology world is doing some amazing work, but quite how amazing it is, and quite how helpful it is to customers (as opposed to companies who simply wish to save costs) remains open to debate.
AI is a great example. Critics say it will never replace human thinking; supporters say it’s merely a question of time before the world’s human workforce is replaced by robots. And the truth lies somewhere between these two extremes.
And, of course, there may be a world of difference between what it is possible to achieve with AI, and what makes commercial (if not moral) sense.
No one will convince me that automated customer service will ever be as good as speaking to a competent human – the automated offering never covers all possible reasons why a customer might wish to make contact. However, chatbots and the like make perfect commercial sense if you are only interested in saving large amounts of money and not too bothered about alienating a few customers.
So, moving forwards, all business owners have to be wary of the claims made for both new and old technologies, and also to understand what impact any change in approach will have on their customer base. Such decisions might just make solving Brexit appear a minor problem in comparison!
NTT reveals lack of strategic ownership is stalling digital transformation plans.
According to NTT Ltd.’s 2019 Digital Means Business Report, only 11% of organizations are highly satisfied with those in charge of spearheading digital transformation, despite the fact that almost three-quarters of them are already underway on their journey.
Organizations worldwide are achieving some success with digital transformation, but there’s still a strong belief that this evolution requires radical, far-reaching changes to achieve success. This, when combined with a lack of strong transformational leadership and a lack of focus on the need to change people, is holding many companies back.
Wayne Speechly, VP of Advanced Competencies, NTT Ltd. said: “Organizations are still grappling with how to shape their business to capitalise on a connected future. Digital creates the opportunity for value to be constantly derived from transformation initiatives across the business. Organizations should focus less on perfecting a grand digital plan, and more on taking considered and iterative steps in their transformation journey to progress value and clarity of subsequent moves. For various reasons, an organization is its own worst enemy, so any change has to be supported by pragmatic, self-aware leadership who are themselves changing.”
Almost half of business leaders are failing to achieve a positive financial return from the digital transformation projects they have executed, despite considering themselves to be ‘digital thinkers’, reveals a new Censuswide survey.
The poll of 250 business leaders at public and private sector organisations with more than 1,000 employees, commissioned by leading HR and payroll provider MHR, found that while 90 percent of business leaders have been responsible for commissioning one or more digital transformation projects, only 54 percent believed the projects were financially benefitting the organisation.
This is despite 95 percent of business leaders perceiving themselves to be ‘digital thinkers’ and over four-fifths (84 percent) believing that they personally have the necessary digital skills required to oversee digital transformation projects in their organisation.
Michelle Shelton, Product Planning Director at MHR, says: “The research highlights that while business leaders are confident in their own abilities to oversee digital change, the reality is that many projects are failing to deliver the financial benefits.
“One of the key drivers for implementing digital change is to deliver cost savings and revenue growth, but this is only achievable if people with the right skills, including a strong financial awareness, are spearheading the change.
“Ahead of carrying out a digital transformation project, it’s important to collaborate with all departments to create a joint strategy and establish a change team responsible for delivering the change.
“By adopting a collaborative approach, organisations can leverage the skills and expertise of their people, and gain a true understanding of their current operation to establish a clear vision for the future.
“Digital transformation projects will almost certainly fail unless you take your people on the journey with you. Consequently, any change team should naturally include HR. As stewards of company culture, HR professionals can ensure any changes are successfully embedded and embraced by employees, and play an active role in helping create a more ‘digital-savvy’ workforce by recruiting new talent to plug any skills gaps and arranging training for existing employees to support the adoption of new software.”
GoTo by LogMeIn has published the findings from a new global survey conducted by Ovum Research. The survey of 2,100 IT buyers and leaders found that communications and collaboration tools were “business critical” to the success of organisations, and investments in these tools need to be made a priority in order to support a growing remote workforce and the rise of digital natives in the office.
As businesses plan for 2020 and beyond, collaboration tools are a major focus for unified communications and collaboration (UCC) deployments, with 73% of survey respondents expecting spending to increase. However, it’s not just about finding a collaboration platform, it’s about finding one that meets the needs of a changing workforce that is seeing an increase in the number of employees working remotely, and digital natives entering the workforce as full-time employees. In fact, the survey found that 93% of respondents agreed that digital natives have different needs and expectations in the workplace, and over half of CIOs (56%) are looking to grow their collaborative software offering to meet that demand.
Digital Natives Need to Be Front and Center of Planning:
According to the survey, C-Suite IT leaders prioritised a number of steps they are taking in anticipation of the growing digital native workforce.
IT Leaders Need to Provide for Today and Future Proof For What’s To Come:
IT leaders play a more strategic role than ever before. They need to consider whether or not to adopt new technology, and accommodate and support a diverse and dispersed workforce, all while keeping costs down and showing ROI for their decisions.
AI Continues To Be Top of Mind:
AI capabilities are continuously improving in ways that help employees. In the coming year, more and more IT leaders will adopt AI technology for smarter, more efficient collaboration.
“Today’s CIOs and IT leaders need to play a more strategic role than ever before. They’ve got a new seat at the table and are expected to drive overall business strategy. The very nature of the way people work is changing and that change needs to be supported through great technology that is simple to use, easy to adopt and painless to manage,” said Mark Strassman, Senior Vice President and General Manager of UCC at LogMeIn. “IT leaders need to find technology partners that are meeting demands of the modern workforce. They need to support digital natives and remote employees to optimize today, modernize for tomorrow and set their employees and business up for long-term success.”
Aryaka has published its third annual 2019 State of the WAN report that reveals SD-WAN, cloud and application performance challenges, priorities and plans for 2019 and beyond.
When comparing this year’s results to the 2018 report, a pattern emerges: more respondents identified complexity, even surpassing performance, as the biggest challenge with their WAN. As applications and cloud connectivity become more complex, so do the networks required to support them. Organizations may recognize this, but don’t always have the expertise or resources to deliver on their digital transformation objectives.
“Our research on migration to SD-WAN concurs with Aryaka’s latest survey results regarding the complexities of managing the underlying WANs in enterprise networks,” said Erin Dunne, Director of Research Services at Vertical Systems Group. “We are seeing more enterprises choose managed SD-WAN solutions focused on providing dynamic WAN connectivity to ensure optimal end-to-end performance for all types of business-critical applications.”
Study Methodology
The third annual Global Aryaka 2019 State of the WAN study surveyed 795 global IT and network practitioners at companies across all verticals, primarily headquartered in North America (57 percent), Europe (20 percent) and Asia excluding China (12 percent), and with up to 1,000 employees (31 percent), up to 10,000 employees (32 percent) and over 10,000 employees (24 percent). The survey asked respondents about their networking and performance challenges, priorities, and plans for 2019 and beyond.
What follows are a few of the key findings from this year’s report.
Cloud Models and New Applications Are Driving Digital Transformation
The majority of surveyed enterprises operate in highly distributed and complex IT environments. Over one-third have 20 or more branches around the globe. Half leverage five or more cloud providers or SaaS applications, and almost 15 percent have over 1,000 applications deployed. These trends impact enterprises’ ability to properly provision, optimize, troubleshoot, and secure their WAN and multi-cloud environments.
Network and Application Performance is Paramount
With lines of business moving at a much faster pace, the WAN needs to evolve to meet the needs of digital transformation. However, traditional architectures do not effectively enable a multi-cloud approach because they are misaligned with a cloud consumption model predicated on flexibility. This results in cost, complexity and performance challenges. 40 percent of those surveyed said cost was a challenge for them (7 percent higher than in 2018); 35 percent cited high complexity, manageability and maintenance (14 percent higher than in 2018); and 24 percent had concerns about slow access to cloud services and SaaS applications (a 3 percent decrease from 2018). With the limited visibility available, enterprises are split on the source of their application challenges: 19 percent cite the branch, 23 percent the middle mile, and 24 percent the application origin.
Unified Communications-as-a-Service (UCaaS) Challenges
One common complaint is poor UCaaS performance across traditional WANs, caused by latency and packet loss. Challenges include poor quality at 41 percent (an 11 percent drop from 2018), lag and delay at 31 percent (12 percent higher than 2018), and management at 28 percent (18 percent higher than 2018). Clearly, management is becoming a greater concern.
Innovation Drives a Better Experience
Traditional carriers and do-it-yourself (DIY) deployments are not equipped to deliver the agility required for digital transformation. Solving slow application performance and managing vendors are the top time-sucks for IT organizations. 45 percent of respondents said slow application performance is leading to poor user experience in branch offices; 36 percent said it is leading to poor user experience for remote and mobile users (nine percent higher than in 2018); and 28 percent said managing telcos or service providers is a nightmare (12 percent higher than in 2018).
Managed SD-WAN is the Future
As organizations plan for the future, their top IT priorities are advanced security (34 percent), cloud migration (31 percent), IT automation (28 percent), and big data and analytics (28 percent). Yet traditional WAN and DIY SD-WAN solutions can’t always support these initiatives, and based on the growing number of respondents who have issues with managing their telcos, the situation is only getting worse. A fully managed global SD-WAN solution promises to provide the flexibility, visibility, enhanced performance and cost control required in a cloud-first era. And the characteristics that enterprises look for in any SD-WAN solution closely track their overall IT priorities – 47 percent are looking for cloud and SaaS connectivity, 43 percent advanced security, 37 percent WAN optimization and application acceleration, and 34 percent are looking to replace their MPLS network.
“We are living in a complex multi-cloud and multi-SaaS application world. As global enterprises continue to innovate by embracing new technologies and migrating to the cloud, they also face new challenges,” said Shashi Kiran, CMO of Aryaka. “Whether it’s an increasing number of global sites through expansion, poor performing cloud-based applications, increasing costs or the time it takes to manage multiple vendors, many organizations are at an inflection point: transform the WAN now or risk falling behind and losing out to competitors.”
Wipro has released its 2019 State of Cybersecurity Report that highlights the rising importance of cybersecurity defense to global leaders, the emergence of the CISO as a C-Suite role, and an unprecedented focus on security as a pervasive part of business operations.
The study found that one in five CISOs are now reporting directly to the CEO, 15% of organizations have a security budget of more than 10% of their overall IT budgets, 65% of organizations are tracking and reporting regulatory compliance, and 25% of organizations are carrying out security assessments in every build cycle. In addition, 39% of organizations now have a dedicated cyber insurance policy. All of these points showed dramatic increases from previous years.
The annual study is based on three months of primary and secondary research, including surveys of security leadership, operational analysts, and 211 global organizations across 27 countries.
Beyond these headline findings, the report notes that organizations are aligning themselves to cyber-resilient strategies in new ways.
Raja Ukil, Global Head for Cybersecurity & Risk Services, Wipro Limited said, “With organizations riding the digital wave, security strategies need to be enhanced to address the changing landscape and enable a smooth and safe transition. Security is also evolving to be a pervasive part of core business operations, and countries are establishing active cyber defence strategies and functions to foster partnerships with the private sector enterprises and with other countries. Amidst growing threats, leaders are collaborating more than ever before in new and innovative ways to mitigate the risks.”
Respondents identify people as biggest source of cyber threats, with Facebook and BA as most notable breaches – but skills shortage has bolstered employment prospects.
A lack of resources is the single biggest challenge for the IT security market, followed by a lack of experience and skills, according to “The Security Profession in 2018/19” report from the Chartered Institute of Information Security (previously known as the IISP) – the independent not-for-profit organisation responsible for promoting professionalism and skills in the IT profession. At least 45 percent of respondents chose a lack of resources as the biggest challenge, compared with 37 percent for a lack of experience and 31 percent for a lack of skills. Ultimately, security professionals feel their budgets are not giving them what they need – only 11 percent said security budgets were rising in line with, or ahead of, the cyber security threat level, while the majority (52 percent) said budgets were rising, but not fast enough.
Professionals were also clear about where threats originate: an overwhelming 75 percent perceived people as the biggest challenge they face in cyber security, with processes and technology near-equal on 12 and 13 percent respectively. This may explain the need for more resources even as budgets increase: people are a far more complex issue to deal with. Yet at the same time, there are signs of improvement. More than 60 percent of IT professionals say the profession is getting better – or much better – at dealing with security incidents when they occur, with only 7 percent saying it is getting worse. By contrast, less than half (48 percent) of respondents felt the industry is getting better at defending systems from attack and protecting data, with 14 percent saying the profession is getting worse. This suggests an ongoing shift in the industry from a focus on prevention to an all-encompassing approach to security.
“IT security is a constant war of attrition between security teams and attackers, and attackers have more luxury to innovate and try new approaches,” said Amanda Finch, CEO, Chartered Institute of Information Security. “As a result, the industry’s focus on dealing with breaches after they occur, rather than active prevention, isn’t a great surprise – the former is where IT teams have much more control. Yet in order to deal with breaches effectively, security teams still need the right resources and to increase those in line with the threat. Otherwise they will inevitably have to make compromises.”
The focus on a lack of resources, experience and skills suggests that IT security teams are feeling the effect of the IT skills shortage. Yet this is also an opportunity for individuals. The majority of IT security professionals surveyed believe this is a good time to join the profession – 86 percent say the industry will grow over the next three years and 13 percent say it will “boom”. There is also an opportunity, and need, for women in the industry – 89 percent of respondents identified as male, and 9 percent as female. More than 37 percent say they have better prospects than a year ago, and the factors attracting people to take security jobs are the same as then – remuneration, followed by scope for progression and variety of work. Insufficient money, or a lack of opportunity, also cause people to leave security positions – yet the top factor causing people to leave their jobs is bad or ineffectual management.
“In the middle of a skills shortage, organisations need to treat their workers carefully. Losing them through a lack of investment, through failing to help develop skills, or simple poor management, cannot be allowed,” continued Amanda Finch. “At the same time, they cannot simply hire anyone to fill the skills gap – bringing the wrong person into a role can be a greater risk than an empty seat. Instead, organisations must understand what roles they need to fill; what skills those roles demand; and what skills applicants have. Armed with this, businesses can fill roles and support workers throughout their careers with the development, opportunities and training they need. This doesn’t only mean developing technical skills, but the social, organisational and strategic skills that are essential to put security at the heart of the business.”
Link11, a leader in cloud-based anti-DDoS protection, has published its DDoS statistics for Q2 2019. The data shows that the quarter saw a massive 97% year-on-year increase in average attack bandwidth, up from 3.3Gbps in Q2 2018 to 6.6Gbps in Q2 2019.
These attacks are easily capable of overloading many companies’ broadband connections. There are several DDoS-for-hire services offering attacks between 10 and 100 Gbps for a modest fee. Currently, one DDoS provider is offering free DDoS attacks of up to 200 Mbps bandwidth for a duration of five minutes.
The maximum attack volumes seen by Link11 between April and June 2019 also increased by 25% year-on-year, to 195Gbps from 156Gbps in Q2 2018. In addition, 19 more high-volume attacks with bandwidths over 100 Gbps were registered in Q2 2019.
Rolf Gierhard, Vice President Marketing at Link11 said: "Too many companies still have the wrong idea when it comes to the threat posed by DDoS attacks. Our data shows that the gap between attack volumes, and the capability of corporate IT infrastructures to withstand them, is widening from quarter to quarter. Given the scale of the threat that organizations are facing, and the fact that the attacks are deliberately aimed at causing maximum disruption, it’s clear that businesses need to deploy advanced techniques to protect themselves against DDoS exploits."
Increasing complexity of attacks
Multi-vector attacks posed an additional threat in Q2 2019, with a significant increase in complex attack patterns. The proportion of multi-vector attacks grew from 45% in Q2 2018 to 63% in the second quarter of 2019. Attackers most frequently combined three vectors (47%), followed by two vectors (35%) and four vectors (15%). The maximum number of attack vectors seen was seven.
New research from Databarracks has revealed organisations are getting better at understanding what IT downtime costs their business.
Data taken from its recently released Data Health Check survey reveals that fewer than one in five organisations (19 per cent) do not know how much IT downtime costs their business, down from over a third (35 per cent) in 2017.
Peter Groucutt, managing director of Databarracks, discusses these findings further:
“Evidence has historically suggested organisations struggle to consider costs, but we’re now seeing companies thinking much more broadly about the financial impact of IT-related downtime.”
Groucutt stresses that having a complete picture of what IT downtime costs your organisation enables you to make better-informed decisions on issues relating to IT resilience, supplier management and continuity planning: “There are several types of indirect costs that need to be considered when estimating the financial impact of an outage to your organisation.
“The obvious costs are well known to a business. They include staff wages, lost revenue or any costs tied to fixing an outage. It is important, however, to look beyond these for a more holistic view of the impact an outage will have financially. ‘Hidden’ or intangible costs, such as damage to reputation, can often outweigh the more obvious, immediate costs – further research from our Data Health Check study revealed reputational damage is the second biggest worry for organisations during a disaster, behind only revenue loss.
“The problem with these intangible costs is they aren’t easy to estimate, and because they often take time to materialise, they can be excluded from calculations. It will always be difficult to secure budget for IT resilience if you can’t show the board a clear picture of the impact downtime will have. Presenting a complete downtime cost immediately puts the cost of investment into context and will help IT departments make the improvements they need. It’s encouraging to see organisations thinking more holistically and factoring these previously ignored costs into their budgeting.”
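Groucutt’s framework lends itself to a simple worked example. The sketch below (Python) shows how the direct and hidden cost categories he describes might be combined into a single downtime figure; every number in it is a hypothetical placeholder, not data from Databarracks or the Data Health Check survey.

```python
# Minimal sketch of a downtime cost estimate, combining the 'obvious'
# and 'hidden' cost categories described above. All figures are
# hypothetical placeholders, not Databarracks data.

def downtime_cost(hours, staff_cost_per_hour, revenue_per_hour,
                  fix_cost, reputation_cost=0.0):
    """Total cost of an outage: direct costs plus intangible costs."""
    # Direct costs: wages paid while systems are down, revenue lost
    # during the outage, and the one-off cost of fixing the problem.
    direct = hours * (staff_cost_per_hour + revenue_per_hour) + fix_cost
    # Intangible costs (e.g. reputational damage) are the hardest to
    # estimate and often materialise late, but can outweigh the rest.
    return direct + reputation_cost

# Example: a 4-hour outage at a mid-sized firm (all numbers invented).
estimate = downtime_cost(hours=4, staff_cost_per_hour=2_500,
                         revenue_per_hour=10_000, fix_cost=15_000,
                         reputation_cost=25_000)
print(f"Estimated outage cost: £{estimate:,.0f}")  # £90,000
```

Presenting a figure built up this way, with the intangible line items made explicit, is precisely the kind of ‘complete downtime cost’ Groucutt argues boards respond to.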
Zerto has published the full findings of its sponsored IDC survey, Worldwide Business Resilience Readiness Thought Leadership Survey. The subsequent report revealed that 91% of respondents have experienced a tech-related disruption in the past two years, even though 82% of respondents said data protection and recovery are important to their digital transformation projects.
The white paper illustrates a perception gap between IT and business decision makers regarding the importance of data availability and success of digital transformation/IT transformation initiatives.
Optimizing resilience planning, the report says, plays an important role in minimizing the financial burden and negative impact of IT-related business disruption. These types of disruptions, the research details, are costing organizations significantly.
The white paper concludes that because most respondents have not optimized their IT resilience strategy, cloud and transformation initiatives are at risk of delay or failure.
However, 90% of respondents indicated intent to increase their IT resilience investments over the next two years.
For many organizations, efforts to improve resilience are taking place against a backdrop of changing data protection and disaster recovery needs.
Interestingly, almost 100% of respondents anticipate cloud playing a role in their organization's future disaster recovery or data protection plans. But today, according to respondents, integrated adoption of cloud-based protection solutions remains low.
Currently only 12.4% of IT budgets (on average) are spent on IT resilience hardware/software/cloud solutions.
Phil Goodwin, Research Director, IDC, commented:
“These survey results indicate that most respondents have not optimized their IT resilience strategy, evidenced by the high levels of IT and business-related disruptions. However, the majority of organizations surveyed will undertake a transformation, cloud, or modernization project within the next two years. This illustrates the need for all organizations to begin architecting a plan for IT resilience to ensure the success of these initiatives.”
He added, “Without such a plan, the high prevalence of disruptive events, unplanned downtime, and data loss indicated by respondents will continue to put cloud and transformation initiatives at risk of delay or failure — creating a financial burden and negative impact to an organization's competitive advantage.”
Avi Raichel, CIO, Zerto, added:
“The resilience of business IT is under constant pressure. Malicious attacks and outages are causing enormous levels of disruption, and it’s clear that for many organizations their ability to avoid and mitigate IT-related disruption is not where it needs to be, and is actually holding back their ability to focus on innovating. IT leaders and professionals clearly understand the pressing requirement for better resilience, and it’s to everyone’s benefit that the momentum behind IT resilience is really building.”
Demand for cloud computing continues to drive European data centre market.
Q2 2019 witnessed a record take-up of 57MW across the four largest colocation markets in Europe. The FLAP markets of Frankfurt, London, Amsterdam and Paris recorded 98MW at the half-year, 11MW above the previous H1 record, according to research from CBRE, the world's leading real estate advisor.
CBRE analysis shows that Frankfurt recorded 44MW of take-up in the first half of the year and is set to beat London’s 2018 full-year total of 77MW, the current highest total for any individual market. Frankfurt’s take-up in H2 will be bolstered by pre-lets to some of the large new facilities set to launch in the market.
London’s subdued start to the year continued, with the lowest take-up of any of the FLAP markets in the first half of 2019. CBRE attributes this to the hyperscale cloud companies having procured record capacity in London during 2018; these companies are now selling that capacity before they need to acquire more.
As the demand for data centre capacity continues to grow, the constraints surrounding the availability of land and power in some areas are driving data centre developer-operators to create new sub-markets within the major FLAP cities. Concerns over space and power led the municipalities of Amsterdam and Haarlemmermeer to jointly place a temporary halt on the development of new data centres. This will not slow the rate of growth in the Amsterdam market, but it may drive developer-operators to new areas of the city.
Mitul Patel, Head of EMEA Data Centre Research at CBRE commented:
“There is no let-up to the extraordinary levels of activity in the European colocation sector. Take-up records are broken every quarter and hyperscale cloud companies continue to be the epicentre of this. As a consequence, winning hyperscale business is more competitive than ever and companies are competing on a number of criteria, including price. To develop its xScale hyperscale product Equinix has gained funding from Singapore’s Sovereign Wealth Fund, GIC.”
Fewer than a third of banks and telcos analyse applications before cloud migration.
CAST, a leader in Software Intelligence, has released its annual global cloud migration report. The report analyzes application modernization priorities in financial and telecommunications firms.
Findings show critical missteps mean cloud migrations are falling short of expectations at mature institutions, with just 40% meeting targets for cost, resiliency and planned user benefits. A lack of pre-migration intelligence and a fear of modernizing legacy mainframe applications are the main drivers of these shortcomings. Adoption of microservices as a modernization technique is also faltering due to a lack of financing.
While these legacy-process institutions realise only a third of their target benefits from cloud migration, cloud-native approaches are enabling FinTech firms to outperform traditional banks, achieving more than half of their target benefits.
Fewer than 35% of technology leaders use freely available analysis tools. There is a systematic failure to assess underlying application readiness for cloud migration with Software Intelligence, a deep analysis of software architecture. IT leaders must ensure the right architectural model and compliance are in place to avoid increasing technical debt. Unchecked, this leads to more IT meltdowns such as TSB’s £330m re-platforming crisis in 2018, with customers paying a heavy price for these mistakes.
More than 50% of banks and telcos are effectively taking leaps of faith, not undertaking essential analysis-led evaluations to support and facilitate cloud migrations. Instead, half the CTOs surveyed use gut instinct and ad-hoc surveys with application owners as the primary basis of their decision to move applications to the cloud. IT leaders need to adopt an analysis-led approach over gut instinct to implement the right cloud migration strategy and realise all potential benefits of migrating to the cloud.
Greg Rivera, VP CAST Highlight at CAST, commented on the findings, “Pilots going into storms turn to their instruments. If you run headfirst into a cloud migration without objectively assessing your applications, you’re flying in the dark.
“Even one small change to an application has a ‘butterfly effect’ on the rest of the code set, so a disruption as big as cloud migration has detrimental effects including IT outages and loss of business. Migration to the cloud is vital when digitally transforming a business. But it needs to be done right if organizations want success instead of suffering.”
More than 40% of software leaders have yet to define a class-based approach to application modernization. Firms with heavily legacy processes tend to rehost apps, yet rehosting, or so-called ‘lift-and-shift’, only benefits apps with up to three years left before end of life. Existing and continuously evolving apps should instead be re-platformed and restructured during cloud migration. To complete a migration successfully, first gather intelligence and assess applications objectively.
Armed with battle scars, software leaders at banks and insurance firms are revisiting their initial ‘lift-and-shift’ approach to cloud migration plans. While FinTech firms outperform mature institutions on cloud-native apps, banks lead the way on cloud-ready applications, with just under 50% rewriting applications. A European Chief Digital Architect said, “Cloud migration is only really a problem if you’re moving workloads without changing the way they are shaped.”
New data from Synergy Research Group shows that 52 data center-oriented M&A deals closed in the first half of 2019, up 18% from the first half of 2018 and continuing a strong growth trend seen over the last four years. The number of deals closed in the first half exceeded the total number closed in the whole of 2016. 2019 is on course to be another record year for data center M&A deal volume, with eight more deals closed since the beginning of July, 14 more agreed with formal closure pending, and the regular flow of M&A activity continuing. In total, since the beginning of 2015 Synergy has now identified well over 300 closed deals with an aggregate value of over $65 billion. Acquisitions by public companies have accounted for 57% of the deal value, while private equity buyers have accounted for 53% of the deal volume.
In terms of deal value the story is a little different from deal count, as the trend is skewed by a small number of huge multi-billion-dollar acquisitions. Eleven such deals were closed during the 2017-2018 period, while 2019 has yet to see a multi-billion-dollar deal close. Since 2015 the largest deals to close are the acquisition of DuPont Fabros by Digital Realty, the Equinix acquisition of Verizon’s data centers and the Equinix acquisition of Telecity. Over the 2015-2019 period, by far the largest investors have been Equinix and Digital Realty, the world’s two leading colocation providers; in aggregate they account for 36% of total deal value over the period. Other notable data center operators that have been serial acquirers include CyrusOne, Iron Mountain, Digital Bridge/DataBank, NTT and Carter Validus.
“Analysis of data center M&A activity helps to affirm some clear trends in the industry, not least of which is that enterprises increasingly do not want to own or operate their own data centers,” said John Dinsdale, a Chief Analyst at Synergy Research Group. “As enterprises either shift workloads to cloud providers or use colocation facilities to house their IT infrastructure, more and more data centers are being put up for sale. This in turn is driving change in the colocation market, with industry giants on a never-ending quest to grow their global footprint and a constant ebb and flow of ownership among small local players. We expect to see a lot more data center M&A over the next five years.”
The 29 must-watch technologies on the Gartner Inc. Hype Cycle for Emerging Technologies, 2019 reveal five distinct emerging technology trends that create and enable new experiences, leveraging artificial intelligence (AI) and other constructs to help organizations take advantage of emerging digital ecosystems.
“Technology innovation has become the key to competitive differentiation. The pace of change in technology continues to accelerate as breakthrough technologies are continually challenging even the most innovative business and technology decision makers to keep up,” said Brian Burke, research vice president at Gartner. “Technology innovation leaders should use the innovation profiles highlighted in the Hype Cycle to assess the potential business opportunities of emerging technologies.”
The Hype Cycle for Emerging Technologies is unique among Gartner Hype Cycles because it distills insights from more than 2,000 technologies into a succinct set of 29 emerging technologies and trends. This Hype Cycle specifically focuses on the set of technologies that show promise in delivering a high degree of competitive advantage over the next five to 10 years (see Figure 1).
Figure 1. Hype Cycle for Emerging Technologies, 2019
Source: Gartner (August 2019)
Five Emerging Technology Trends
Sensing and Mobility
By combining sensor technologies with AI, machines are gaining a better understanding of the world around them, enabling mobility and the manipulation of objects. Sensing technologies are a core component of the Internet of Things (IoT) and of the vast amounts of data it collects. Applying intelligence to that data yields many types of insight that can be applied across many scenarios.
For example, over the next decade AR cloud will create a 3D map of the world, enabling new interaction models and in turn new business models that will monetize physical space.
Enterprises seeking to leverage sensing and mobility capabilities should consider the following technologies: 3D-sensing cameras, AR cloud, light-cargo delivery drones, flying autonomous vehicles, and autonomous driving Levels 4 and 5.
Augmented Human
Augmented human advances enable the creation of cognitive and physical improvements as an integral part of the human body. One example is the provision of superhuman capabilities, such as limb prosthetics whose characteristics can exceed the highest natural human performance.
Emerging technologies focused on extending humans include biochips, personification, augmented intelligence, emotion AI, immersive workspaces and biotech (cultured or artificial tissue).
Postclassical Compute and Comms
For decades, classical core computing, communication and integration technologies have made significant advances largely through improvements in traditional architectures — faster CPUs, denser memory and increasing throughput as predicted by Moore’s Law. The next generations of these technologies adopt entirely new architectures. This category includes not only entirely new approaches, but also incremental improvements that have potentially dramatic impacts.
For example, low earth orbit (LEO) satellites can provide low-latency internet connectivity globally. These constellations of small satellites will enable connectivity for the 48% of homes that are currently not connected, providing new opportunities for economic growth in unserved countries and regions. “With only a few satellites launched, the technology is still in its infancy, but over the next few years it has the potential for a dramatic social and commercial impact,” said Mr. Burke.
Enterprises should evaluate technologies such as 5G, next-generation memory, LEO systems and nanoscale 3D printing.
Digital Ecosystems
Digital ecosystems leverage an interdependent group of actors (enterprises, people and things) sharing digital platforms to achieve a mutually beneficial purpose. Digitalization has facilitated the deconstruction of classical value chains, leading to stronger, more flexible and resilient webs of value delivery that are constantly morphing to create new improved products and services.
Critical technologies to be considered include: DigitalOps, knowledge graphs, synthetic data, decentralized web and decentralized autonomous organizations.
Advanced AI and Analytics
Advanced analytics comprises the autonomous or semiautonomous examination of data or content using sophisticated techniques and tools, typically beyond those of traditional business intelligence (BI).
“The adoption of edge AI is increasing for applications that are latency-sensitive (e.g., autonomous navigation), subject to network interruptions (e.g., remote monitoring, natural language processing [NLP], facial recognition) and/or are data-intensive (e.g., video analytics),” said Mr. Burke.
The technologies to track include adaptive machine learning (ML), edge AI, edge analytics, explainable AI, AI platform as a service (PaaS), transfer learning, generative adversarial networks and graph analytics.
This year, Gartner refocused the Hype Cycle for Emerging Technologies to shift toward introducing new technologies that have not been previously highlighted in past iterations of this Hype Cycle. While this necessitates retiring most of the technologies that were highlighted in the 2018 version, it does not mean that those technologies have ceased to be important.
5.8 billion enterprise and automotive IoT endpoints will be in use in 2020
Gartner, Inc. forecasts that the enterprise and automotive Internet of Things (IoT) market will grow to 5.8 billion endpoints in 2020, a 21% increase from 2019. By the end of 2019, 4.8 billion endpoints are expected to be in use, up 21.5% from 2018.
Utilities will be the highest user of IoT endpoints, totaling 1.17 billion endpoints in 2019 and increasing 17% in 2020 to reach 1.37 billion endpoints. “Electricity smart metering, both residential and commercial, will boost the adoption of IoT among utilities,” said Peter Middleton, senior research director at Gartner. “Physical security, where building intruder detection and indoor surveillance use cases will drive volume, will be the second largest user of IoT endpoints in 2020.”
Building automation, driven by connected lighting devices, will be the segment with the largest growth rate in 2020 (42%), followed by automotive and healthcare, which are forecast to grow 31% and 29% in 2020, respectively (see Table 1). In healthcare, chronic condition monitoring will drive the most IoT endpoints, while in automotive, cars with embedded IoT connectivity will be supplemented by a range of add-on devices to accomplish specific tasks, such as fleet management.
Table 1
IoT Endpoint Market by Segment, 2018-2020, Worldwide (Installed Base, Billions of Units)
Segment | 2018 | 2019 | 2020 |
Utilities | 0.98 | 1.17 | 1.37 |
Government | 0.40 | 0.53 | 0.70 |
Building Automation | 0.23 | 0.31 | 0.44 |
Physical Security | 0.83 | 0.95 | 1.09 |
Manufacturing & Natural Resources | 0.33 | 0.40 | 0.49 |
Automotive | 0.27 | 0.36 | 0.47 |
Healthcare Providers | 0.21 | 0.28 | 0.36 |
Retail & Wholesale Trade | 0.29 | 0.36 | 0.44 |
Information | 0.37 | 0.37 | 0.37 |
Transportation | 0.06 | 0.07 | 0.08 |
Total | 3.96 | 4.81 | 5.81 |
Source: Gartner (August 2019)
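The segment growth rates quoted above can be recovered directly from the installed-base figures in Table 1. A quick arithmetic check in Python (values transcribed from the table; this is a reader's sanity check, not Gartner's methodology):

```python
# Recompute 2020 growth rates from the Table 1 installed-base
# figures (billions of units). Values transcribed from the table.
endpoints = {                     # segment: (2019, 2020)
    "Utilities":            (1.17, 1.37),
    "Building Automation":  (0.31, 0.44),
    "Automotive":           (0.36, 0.47),
    "Healthcare Providers": (0.28, 0.36),
}

for segment, (y2019, y2020) in endpoints.items():
    growth = (y2020 / y2019 - 1) * 100
    print(f"{segment}: {growth:.0f}% growth in 2020")
# Utilities: 17%, Building Automation: 42%,
# Automotive: 31%, Healthcare Providers: 29%
```

The results line up with the figures cited in the text: 17% for utilities, 42% for building automation, and 31% and 29% for automotive and healthcare respectively.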
Top Use-Case Opportunities Vary by Region
Similar to 2019, residential electricity smart metering, which can be used for more accurate metering and billing in the home, will be the top use case for Greater China and Western Europe in 2020, and will represent 26% and 12% of total IoT endpoints, respectively. North America, in comparison, will see its highest IoT endpoint adoption in building intruder detection, such as door and window sensors, which will represent 8% of total IoT endpoints.
North America and Greater China Have the Biggest Market for Endpoint Electronics Revenue
In 2020, revenue from endpoint electronics will total $389 billion globally and will be concentrated across three regions: North America, Greater China and Western Europe. These three regions will represent 75% of overall endpoint electronics revenue. North America will record $120 billion, Greater China will achieve $91 billion and Western Europe will come in third, totaling $82 billion in 2020.
In 2020, the two use cases that will produce the most endpoint electronics revenue will be consumer connected cars and networkable printing and photocopying, totaling $72 billion and $38 billion, respectively. Connected cars will retain a significant portion of total endpoint electronics spending, driven by increasing electronics complexity and by manufacturers implementing connectivity in a greater percentage of their vehicle production. While printers and photocopiers will contribute significant spending in 2020, that market will decline slowly, and other use cases such as indoor surveillance will rise as governments focus on public safety.
“Overall, end users will need to prepare to address an environment where the business units will increasingly buy IoT-enabled assets without policies for support, data ownership or integration into existing business applications,” said Alfonso Velosa, research vice president at Gartner. This will require the CIO’s team to start developing a policy and architecture-based approach to support business units’ objectives, while protecting the organization from data threats.
“Product managers will need to deliver but also to clearly and loudly communicate their IoT-based business value to specific verticals and their business processes, if they are to succeed in this crowded arena,” concluded Mr. Velosa.
Worldwide 5G network infrastructure revenue to reach $4.2 billion in 2020
In 2020, worldwide 5G wireless network infrastructure revenue will reach $4.2 billion, an 89% increase from 2019 revenue of $2.2 billion, according to Gartner, Inc.
Additionally, Gartner forecasts that investments in 5G NR network infrastructure will account for 6% of the total wireless infrastructure revenue of communications service providers (CSPs) in 2019, and that this figure will reach 12% in 2020 (see Table 1).
“5G wireless network infrastructure revenue will nearly double between 2019 and 2020,” said Sylvain Fabre, senior research director at Gartner. “For 5G deployments in 2019, CSPs are using non-stand-alone technology. This enables them to introduce 5G services more quickly, as 5G New Radio (NR) equipment can be rolled out alongside existing 4G core network infrastructure.”
In 2020, CSPs will roll out stand-alone 5G technology, which will require 5G NR equipment and a 5G core network. This will lower costs for CSPs and improve performance for users.
Table 1: Wireless Infrastructure Revenue Forecast, Worldwide, 2018-2021 (Millions of Dollars)
Segment | 2018 | 2019 | 2020 | 2021 |
5G | 612.9 | 2,211.4 | 4,176.0 | 6,805.6 |
2G | 1,503.1 | 697.5 | 406.5 | 285.2 |
3G | 5,578.4 | 3,694.0 | 2,464.3 | 1,558.0 |
LTE and 4G | 20,454.7 | 19,322.4 | 18,278.2 | 16,352.7 |
Small Cells | 4,785.6 | 5,378.4 | 5,858.1 | 6,473.1 |
Mobile Core | 4,599.0 | 4,621.0 | 4,787.3 | 5,009.5 |
Total | 37,533.6 | 35,924.7 | 35,970.5 | 36,484.1 |
Due to rounding, figures may not add up precisely to the totals shown.
Source: Gartner (August 2019)
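The 6% and 12% shares of CSP wireless infrastructure revenue cited above can be checked against Table 1. A minimal sketch in Python (revenue figures in millions of dollars, transcribed from the table):

```python
# Check the quoted 5G share of total wireless infrastructure revenue
# (6% in 2019, 12% in 2020) against the Table 1 figures ($M).
revenue_5g    = {"2019": 2_211.4,  "2020": 4_176.0}
revenue_total = {"2019": 35_924.7, "2020": 35_970.5}

for year in ("2019", "2020"):
    share = revenue_5g[year] / revenue_total[year] * 100
    print(f"{year}: 5G is {share:.0f}% of wireless infrastructure revenue")
# 2019: 6%, 2020: 12%
```

Both values round to the shares Gartner quotes, and the table makes the broader point visible: total wireless infrastructure revenue is roughly flat, so 5G's growth comes largely at the expense of 2G, 3G and LTE spending.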
5G Rollout Will Accelerate Through 2020
5G services will launch in many major cities in 2019 and 2020. Services have already begun in the U.S., South Korea and some European countries, including Switzerland, Finland and the U.K. CSPs in Canada, France, Germany, Hong Kong, Spain, Sweden, Qatar and the United Arab Emirates have announced plans to accelerate 5G network building through 2020.
As a result, Gartner estimates that 7% of CSPs worldwide have already deployed 5G infrastructure in their networks.
CSPs Will Increasingly Aim 5G Services at Enterprises
Although consumers represent the main segment driving 5G development, CSPs will increasingly aim 5G services at enterprises. 5G networks are expected to expand the mobile ecosystem to cover new industries, such as the smart factory, autonomous transportation, remote healthcare, agriculture and retail sectors, as well as enable private networks for industrial users.
Equipment vendors view private networks for industrial users as a market segment with significant potential. “It’s still early days for the 5G private-network opportunity, but vendors, regulators and standards bodies have preparations in place,” said Mr. Fabre. Germany has set aside the 3.7GHz band for private networks, and Japan is reserving the 4.5GHz and 28GHz bands for the same purpose. Ericsson aims to deliver solutions via CSPs to build private networks with high levels of reliability, performance and secure communications. Nokia has developed a portfolio to enable large industrial organizations to invest directly in their own private networks.
“National 5G coverage will not occur as quickly as with past generations of wireless infrastructure,” said Mr. Fabre. “To maintain average performance standards as 5G is built out, CSPs will need to undertake targeted strategic improvements to their 4G legacy layer, by upgrading 4G infrastructure around 5G areas of coverage. A less robust 4G legacy layer adjoining 5G cells could lead to real or perceived performance issues as users move from 5G to 4G/LTE Advanced Pro. This issue will be most pronounced from 2019 through 2021, a period when 5G coverage will be focused on hot spots and areas of high population density.”
AI Augmentation to create $2.9 trillion of business value in 2021
In 2021, artificial intelligence (AI) augmentation will create $2.9 trillion of business value and 6.2 billion hours of worker productivity globally, according to Gartner, Inc.
Gartner defines augmented intelligence as a human-centered partnership model of people and AI working together to enhance cognitive performance. This includes learning, decision making and new experiences.
“Augmented intelligence is all about people taking advantage of AI,” said Svetlana Sicular, research vice president at Gartner. “As AI technology evolves, the combined human and AI capabilities that augmented intelligence allows will deliver the greatest benefits to enterprises.”
Business Value of Augmented Intelligence
Gartner’s AI business value forecast highlights decision support/augmentation as the largest type of AI by business value-add with the fewest early barriers to adoption (see Figure 1). By 2030, decision support/augmentation will surpass all other types of AI initiatives to account for 44% of the global AI-derived business value.
Figure 1: Worldwide Business Value by AI Type (Millions of Dollars)
Source: Gartner (August 2019)
Augmented Intelligence Enhances Customer Experience
Customer experience is the primary source of AI-derived business value, according to the Gartner AI business value forecast. Augmented intelligence reduces mistakes while delivering customer convenience and personalization at scale, democratizing what was previously available to the select few. “The goal is to be more efficient with automation, while complementing it with a human touch and common sense to manage the risks of decision automation,” said Ms. Sicular.
“The excitement about AI tools, services and algorithms misses a crucial point: The goal of AI should be to empower humans to be better, smarter and happier, not to create a ‘machine world’ for its own sake,” said Ms. Sicular. “Augmented intelligence is a design approach to winning with AI, and it assists machines and people alike to perform at their best.”
Despite a slowing global economy and the looming trade war between the United States and China, purchases of information and communications technology (ICT) are expected to maintain steady growth over the next five years. A new forecast from International Data Corporation (IDC) predicts worldwide ICT spending on hardware, software, services, and telecommunications will achieve a compound annual growth rate (CAGR) of 3.8% over the 2019-2023 forecast period, reaching $4.8 trillion in 2023.
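For readers who want to sanity-check forecasts like this, the CAGR relationship is straightforward to invert. A minimal sketch in Python; note that treating 2019-2023 as four compounding years is our assumption, since IDC does not state its base-year figure here:

```python
# CAGR relates start and end values: end = start * (1 + cagr) ** years.
# Inverting it gives the implied 2019 base for IDC's $4.8T 2023 figure,
# assuming four compounding years (2019 to 2023); the exact convention
# IDC uses is an assumption on our part.
end_2023 = 4.8        # trillions of dollars (from the forecast)
cagr = 0.038          # 3.8% compound annual growth rate
years = 4             # 2019 -> 2023

start_2019 = end_2023 / (1 + cagr) ** years
print(f"Implied 2019 ICT spend: ${start_2019:.2f} trillion")  # ~$4.13 trillion
```

The same arithmetic applies to the other CAGR figures quoted throughout this roundup, such as the 8.2% CAGR behind IDC's customer experience spending forecast later in this section.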
"Global market conditions remain volatile, and although the economy has performed broadly better than expected in the past six months in many countries, a sense of uncertainty over the short-term economic and business outlook has been rising at the same time," said Serena Da Rold, program manager in IDC's Customer Insights and Analysis group. "Confidence indicators are fluctuating on a monthly basis, depending on short-term indicators ranging from speculation over tariffs and trade wars to political wild cards, with a potential global slowdown looming for 2019 and 2020. End-user surveys reflect the impact of this uncertainty on business decision-making, but our forecasts remain roughly stable overall for 2019 compared with our previous release, and slightly accelerated in the medium term, driven by stronger growth in software and hardware. Digital transformation and the adoption of automation technologies will be driving investments in applications, analytics, middleware, and data management software, as well as increasing demand for server and storage capacity."
Commercial purchases will account for nearly two thirds of all ICT spending by 2023, up from 60.4% in 2018 and growing at a solid five-year CAGR of 5.1%. Banking and discrete manufacturing will be the industries spending the most on ICT over the forecast period followed by professional services, which will also see the fastest growth in ICT spending, driven largely by service provider spending. Media and personal and consumer services will also grow nicely as these companies transform their businesses to offer new services and improve customer experience.
While purchases for planned upgrades and refresh cycles will continue to be the largest driver of commercial ICT spending, new investments in the technologies and services that enable the digital transformation (DX) of business models, products and services, and organizations will be a significant source of spending. IDC recently forecast worldwide DX spending to reach $1.18 trillion in 2019.
Consumer ICT spending will grow at a much slower rate (1.5% CAGR), resulting in a gradual loss of share over the five-year forecast period. Consumer spending will be dominated by purchases of mobile telecom services and devices (such as smartphones, notebooks, and tablets).
The United States will be the largest geographic market with ICT spending forecast to reach $1.66 trillion in 2023. Western Europe will be the second largest region with $927 billion in ICT spending in 2023, followed by China at $618 billion. China will also be the fastest growing region with a five-year CAGR of 6.1%.
"In the U.S., a favorable business climate and strong consumer confidence continues to buoy technology spending and innovative projects. Tech-intense areas such as the financial services sector and telecom industry are holding strong as they are committed to serving their demanding and evolving customers in new and innovative ways," said Jessica Goepfert, vice president in IDC's Customer Insights and Analysis group. "While the spending is more fragmented, consumer-facing industries like retail and personal and consumer services are also continuing to enjoy the benefits of healthy consumer confidence and higher wages and disposable incomes, and we see investments to develop and deliver an unforgettable customer experience and boosting customer loyalty. We continue to monitor the impact of the tariffs and trade wars on the manufacturing sector where are still bright spots, namely around projects that enable the efficient utilization of fixed assets while maximizing capacity utilization."
"Digital transformation is catching up in Asia/Pacific at an accelerated pace, and this will continue to drive significant investments in technologies in the next few years – from hardware and services to applications. The investments are driven by both government and enterprises in the region as they are understanding the value of what these new technologies bring to the overall operational activities. It also harnesses the potential of a lot of initiatives being launched to make the workforce well versed. Upskilling and future-proofing the workforce are on top of employers' and the governments' agenda," said Ashutosh Bisht, senior research manager with IDC's Customer Insights and Analysis group.
Worldwide spending on customer experience (CX) technologies will total $508 billion in 2019, an increase of 7.9% over 2018, according to the inaugural Worldwide Semiannual Customer Experience Spending Guide from International Data Corporation (IDC). As companies focus on meeting the expectations of customers and providing a differentiated customer experience, IDC expects CX spending to achieve a compound annual growth rate (CAGR) of 8.2% over the 2018-2022 forecast period, reaching $641 billion in 2022.
IDC defines customer experience (CX) as a functional activity encompassing business processes, strategies, technologies, and services that companies use, irrespective of industry, to provide a better experience for their customer and to differentiate themselves from their competitors. The term customer refers to individuals (B2C) as well as groups (B2B). IDC focuses only on business process and therefore does not include the customer's experience of the actual design of the product that the company sold to the customer, nor does it include aspects specific to the product or service such as the user interface or the product aesthetics.
"Customer experience has become a key differentiator for businesses worldwide. New innovation accelerator technologies like artificial intelligence and data analytics are at the forefront in driving the differentiation for businesses to succeed in their customer experience strategic initiatives," said Craig Simpson, research manager, Customer Insights & Analysis.
CX spending will be distributed somewhat evenly across the 16 use cases identified by IDC. In fact, the top six use cases will account for less than one third of overall spending this year. The CX use case that will see the most spending in 2019 and throughout the forecast is customer care and support followed by order fulfillment and interaction management. The use cases that will see the fastest spending growth over the five-year forecast period are AI-driven engagement, interaction management, and ubiquitous commerce.
The retail industry will spend the most on CX technologies in 2019 ($56.7 billion) and throughout the forecast. Digital marketing, AI-driven engagement, and order fulfillment will be the use cases that receive the most funding from retail organizations. Discrete manufacturing and banking will be the second and third largest industries in 2019. Customer care and support will be the primary use case for both industries. Retail and healthcare will be the two industries with the fastest spending growth over the forecast period with CAGRs of 13.1% and 11.5% respectively.
From a technology perspective, services will be the largest area of CX spending at $220 billion in 2019. Most of this total will be divided between business services and IT services. Software will be the second largest area of CX technology spending led by CRM applications and content applications. Hardware, including infrastructure and devices, will account for nearly 20% of overall CX spending while telecommunications services will be less than 10% of total spending.
The United States will be the largest geographic market for CX spending in 2019 led by the discrete manufacturing and retail industries. Western Europe will be the second largest region with banking and retail as the top industries. The third largest market will be China, led by healthcare and retail CX spending. China will also see the fastest growth in CX spending with a five-year CAGR of 13.6%.
International Data Corporation (IDC) completed its seventh annual survey examining the latest investment trends in the Internet of Things (IoT) as well as the opportunities and challenges facing IoT buyers worldwide. IDC found the majority of organizations that have deployed IoT projects have determined the KPIs to measure success – however, the specific KPIs differ significantly by industry.
Respondents of IDC's Global IoT Decision Maker Survey include IT and line of business decision makers (director and above) from 29 countries across six industries that have invested or plan to invest in IoT projects. Some key findings include:
An International Data Corporation (IDC) special report on developers, DevOps professionals, IT decision makers, and line of business executives found that developers have significant autonomy with respect to the selection of developer tools and technologies. In addition, developers exercise influence over enterprise purchasing decisions and should be viewed as key stakeholders in IT purchasing and procurement within any organization undergoing a movement to cloud accompanied by an internal digital transformation.
"The autonomy and influence enjoyed by developers today is illustrative of the changing role of developers in enterprise IT in an era of rapidly intensifying digital transformation," said Arnal Dayaratna, research director, Software Development at IDC. "Developers are increasingly regarded as visionaries and architects of digital transformation as opposed to executors of a pre-defined plan delivered by centralized IT leadership."
The study, based on a global survey of 2,500 developers, also found that the contemporary landscape of software development languages and frameworks remains highly fragmented. This creates a range of challenges for developer teams, as well as potentially significant implications for the long-term support of applications built today. Given this environment, the languages most likely to keep gaining traction are those that support a variety of use cases and deployment environments (such as Python and Java), those with specializations that differentiate them from other languages (as exemplified by JavaScript), and those for which skills are readily available as staffing needs expand.
Other key findings from IDC's PaaSView survey include the following:
"Developer interest in DevOps reflects a broader interest in transparency and collaboration that illustrates the trend in software development to not only use open source technologies, but also to integrate open source practices into software development," said Al Gillen, group vice president, Software Development and Open Source at IDC. "Developers prioritize decentralized collaboration and code contributions as well as transparent documentation of the reasoning for code-related decisions."
European blockchain spending to grow to $4.9 billion by 2023
IDC's new Worldwide Semiannual Blockchain Spending Guide predicts continuous growth in blockchain spending across Europe, from over $800 million in 2019 to $4.9 billion in 2023, a CAGR of 65.1% between 2018 and 2023. Although this is a smaller CAGR than the average growth previously forecast for 2017–2022, knowledge about opportunities in blockchain is spreading from big enterprises to emerging start-ups looking to ensure secure and reliable management of money, personal data, and assets, which is proving to be necessary in a digitalized and data-driven market.
The perception of blockchain in the European market (and worldwide) is moving away from being only a cryptocurrency tool and a financial-only technology. Blockchain first took hold in the banking industry, and banking still accounts for 31% of total spending in 2019, with cross-border payments and trade finance the fastest-growing banking use cases. With blockchain well established in banking, other industries are increasingly stepping up and taking part in the digital transformation.
Manufacturing, professional services, retail, and banking show the greatest promise in their future investment in blockchain with above average CAGRs, hoping to improve transparency and assured authenticity in their businesses.
New use cases are emerging, driven by growing awareness of what blockchain is and what it can and cannot do. Blockchain enables companies to cut out the middleman, thereby saving costs and reducing risks of fraudulent behavior and human error. Identity management is a new use case on the rise in Europe, being implemented in insurance, banking, government, and personal and consumer services. With data emerging as one of the most valuable resources, it is becoming crucial for companies to have an effective and safe way to store, secure, and use consumers' personal data. Blockchain offers a decentralized and encrypted system to do this and it is now being used for a range of purposes, including i-voting (internet/electronic voting) and intellectual property management.
"Companies are beginning to view blockchain not only for its cryptographic means, but rather as a management tool that can keep track of items, information, and customer data. This is something that can be used by the transportation industry to track shipments, by top-quality luxury goods retailers to track provenance, or by real-estate professionals for transparent property management," said Carla La Croce, senior research analyst, Customer Insights and Analysis, IDC. "As spending continues to grow, the market will most likely adapt and security and validity will become a customer standard, with more companies turning to blockchain for a safe and reliable solution."
There are some challenges, however, according to Mohamed Hefny, program manager, Systems and Infrastructure Solutions, IDC. These include the lack of rules and regulations, and getting small producers of goods, like farmers and fishermen, to use the enterprise platforms. "However, the recognition from the European Commission of blockchain's importance to a single digital market and the work done by the big cloud blockchain service providers to create simple mobile apps for their smaller consortium members, and similar developments, are very promising," he said.
One of the key features of the managed services model that appeals to both end-user customers and MSPs themselves is the way services can be added, augmented or, very occasionally, reduced. This portfolio approach to services has a lot to recommend it, with the technology being delivered, integrated and updated by the experts among the vendors and channels. One of its side effects, however, is that these services need not all be supplied by the same provider, and some players are only just realising this.
While there is concern among MSPs that this approach might dilute their relationship with the customer, at the same time, they acknowledge that they cannot be expert in every service they can deliver; hence the rise of the partner ecosystem.
This is especially apparent in security, where the rise of the managed security service provider (MSSP) has been notable in the last year. According to the Kaspersky IT Security Risks Survey 2019, 59% of organisations plan to use an MSP in the “near future” to help them reduce their security-related costs, while 43% of businesses value the “dedicated expertise” that comes with outsourcing their IT security support.
In other areas as well, expertise is in short supply. The Internet of Things (IoT) is just at the stage where meaningful, repeatable and reliable solutions are becoming available to the channel. As distributor Arrow’s EMEA IoT chief Paul Karrer says, “Data acquisition needs to be done before any analytics can start. And then the partners doing the data acquisition are not necessarily the ones to do the analytics.”
So there is a big opportunity for partners (and distributors and technology partners) to work together; Paul Karrer certainly sees more partners specialising and working as part of a team.
Also in the ecosystem there are network specialists, storage experts, even channels specialising in marketing and engaging with customers’ own data to talk to the outside world. These are all areas of expertise that require investment and commitment both to enter and then sustain with the expert level of knowledge required. It is no surprise that many MSPs are realising that they cannot do everything, and are looking to partner.
This is at the heart of the rising trend for ecosystems of managed service providers.
Acting as a brake, however, is the tangle of responsibilities and contracts linked to service supply, including, but not limited to, Service Level Agreements (SLAs). Many of the current managed services deals have not been tested in adversity; the measurement of service issues and the implications of raised customer expectations are yet to be revealed at any scale.
The managed services industry will need to take these on board. The multiple layers of supply in the IT industry go beyond managed services, as many are now recognizing. As Tech Data’s European SVP Miriam Murphy told this year’s EMEA GTDC conference for major distributors in Lisbon, there needs to be wider changes in distribution contracts to reflect digitalisation and a new spread of responsibilities in the supply chain. Services are a key part of that.
Building the partner ecosystem around services and finding different sorts of partner will be one of the debating points at this year’s Managed Services Summits in London on September 18 and in Manchester on October 30.
The Managed Services & Hosting Summits are firmly established as the leading managed services events for the channel. Now in its ninth year, the London Managed Services & Hosting Summit 2019 on September 18 aims to provide insights into how managed services continue to grow and change, as customer demands push suppliers into a strategic advisory role and as the ecosystems grow. The Managed Services Summit North, in Manchester, will look at how regional expertise is developing in managed services, with local provision through a partner ecosystem.
DW talks to Mike Rivers, Product Director at GTT, about disrupting the established telecoms market, as the company seeks to help end users ensure that the network part of the overall IT stack is an enabler rather than a bottleneck when it comes to digital transformation.
1. Please can you provide a brief background on the company?
GTT connects people across organisations, around the world and to every application in the cloud. We aim to deliver an outstanding experience to clients through our core values of simplicity, speed and agility. GTT operates a global network and provides a comprehensive suite of cloud networking services to any location in the world.
2. And what have been the major milestones to date?
Through a series of strategic mergers and acquisitions, we have developed a Tier 1 IP network, among the largest in the industry, with over 600 PoPs spanning six continents and service presence in more than 140 countries. In addition to building our global network footprint via M&A activity, our acquisitions have deepened our product portfolio, enabling us to add clients and world-class sales talent. For example, our acquisition of Hibernia in 2017 added a transatlantic fibre network to our portfolio. Last year’s acquisition of Interoute added significant scale, deepening our local presence throughout Europe and adding a pan-European fibre network.
3. And how does GTT distinguish itself in what’s a very busy market?
Our opportunity lies in disrupting the established telecoms marketplace. As a company we don’t have legacy services to protect, and we also have the flexibility and entrepreneurial spirit to adopt innovative new technologies to win business and gain market share. Our industry-leading software-defined wide area networking (SD-WAN) service is an example of such innovation that brings substantial benefits to our clients in terms of improved network efficiency and application performance.
GTT is also exclusively focused on the B2B market, which contrasts with many of the larger incumbent providers that have shifted attention away from serving the enterprise market towards consumer, mobile, and media content-related businesses.
Our global Tier 1 IP backbone also sets us apart. It securely connects client locations to any destination on the internet or to any cloud service provider. We offer the widest range of access options with bundled network security, making it simple and cost-effective to integrate new locations and add network bandwidth as needed.
Finally, we strive to be a company that’s easy to do business with, built on our core values of simplicity, speed and agility. We think this truly sets us apart, and it’s one of the many reasons that we’ve grown so rapidly.
4. Please can you provide a brief overview of the GTT portfolio?
GTT serves national, international and global enterprises, governments and universities, as well as the world’s largest telecoms operators and OTT providers, with a portfolio of network services. These services broadly fall into two categories.
The first is our managed services portfolio, which tends to focus on supporting enterprise clients. We offer a comprehensive suite of cloud networking services, including wide-area networking (SD-WAN, Ethernet, VPLS, MPLS) and internet (IP transit, DIA, managed broadband). We also provide managed security and unified communications (Cloud UC and SIP trunking).
The second is our network services business: the transport and infrastructure (wavelength, dark fibre, Ethernet, colocation) play for large-scale customers such as carriers, government organisations and household-name OTT brands.
5. And the GTT managed services?
We deliver cloud networking services to multinational clients. The majority want a global network solution that can meet all of their connectivity needs, so part of the value we deliver is managed network services that support the full management of critical IT capabilities – everything from network design, implementation, management and monitoring to gathering quotes from our access partners. This is typically something clients value, as it removes complexity and allows them to focus their scarce IT resources on core business requirements.
6. In more detail, can you talk us through the transport and infrastructure services?
We operate one of the largest, most advanced fibre networks in Europe, connecting 21 countries. As part of our suite of cloud networking services, we sell high-capacity wavelengths to some of the largest cloud providers in the world. Ethernet is another technology that we frequently sell for cloud connectivity and low-latency, point-to-point connections. We also sell dark fibre and colocation.
7. And your WAN offering?
From VPLS and MPLS to SD-WAN, we offer multiple flavours of WAN. However, the type of WAN that we offer to each client depends on individual need.
In the last few years, we’ve seen a significant uptake in SD-WAN, which is a point of differentiation for us. SD-WAN is much more advanced and intelligent than traditional networks. It leverages the GTT internet backbone at both the technical and the service level. SD-WAN combines the power of the internet with greater use of software so that the network can adapt to the demands being placed on it. Using capabilities like overlay networking, data analysis and analytics, we can offer clients agile methods to get traffic to each location and application in a cost-efficient manner. SD-WAN, coupled with our Tier 1 global IP backbone, provides advantages for cloud application performance and simplifies cloud adoption.
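To make that idea concrete, here is a minimal sketch, in Python, of the kind of application-aware path selection an SD-WAN overlay performs. The paths, metrics and policy weights are all invented for illustration; this is not GTT’s implementation.

```python
# Minimal sketch of application-aware path selection, as an SD-WAN
# overlay might perform it. Paths, metrics, and policies are invented
# for illustration; a real controller measures these continuously.

from dataclasses import dataclass

@dataclass
class Path:
    name: str
    latency_ms: float   # measured round-trip latency
    loss_pct: float     # measured packet loss
    cost_per_gb: float  # commercial cost of carrying traffic

# Hypothetical underlay options for one branch site.
PATHS = [
    Path("mpls", latency_ms=18, loss_pct=0.01, cost_per_gb=0.40),
    Path("broadband_internet", latency_ms=32, loss_pct=0.30, cost_per_gb=0.05),
    Path("4g_backup", latency_ms=55, loss_pct=0.80, cost_per_gb=1.20),
]

# Per-application policy: latency-sensitive apps weight delay heavily,
# bulk transfers weight cost heavily.
POLICIES = {
    "voice":  {"latency": 5.0, "loss": 10.0, "cost": 0.1},
    "backup": {"latency": 0.1, "loss": 1.0,  "cost": 5.0},
}

def best_path(app: str) -> Path:
    """Pick the lowest-scoring (best) path for an application."""
    w = POLICIES[app]
    return min(
        PATHS,
        key=lambda p: w["latency"] * p.latency_ms
                      + w["loss"] * p.loss_pct
                      + w["cost"] * p.cost_per_gb,
    )

for app in POLICIES:
    print(app, "->", best_path(app).name)
```

Run as-is, voice lands on the low-latency MPLS path while backup traffic takes cheap broadband, which is exactly the per-application behaviour described above.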
8. And then there’s voice?
This is a classic networking product. We can provide a suite of unified communications services that offer global reach and improve productivity. Whether businesses are migrating to Cloud UC or connecting voice infrastructure over SIP, GTT offers both. Our voice services help organisations take advantage of the advanced functionality and cost efficiencies of cloud-based service delivery, such as hosted PBX solutions based on soft client technology, backed by the global reach of our SIP-based voice network.
9. And, finally, Internet?
GTT is one of the top five internet backbones in the world.
We provide a fast and reliable internet experience. We own and operate AS3257, one of the top ranked Tier 1 IP networks in the world, providing the scalability and network reach that organisations need to connect globally.
We reach roughly a third of all internet destinations and have direct connectivity with every major cloud provider sitting across the backbone. This provides a real advantage for our clients: it means we can provide performance guarantees for application connectivity, and it means our cost base is lower than that of our competitors.
10. Focusing in on the WAN offering in more detail, how would you say that the SD-WAN market has developed up to now?
The SD-WAN market has seen a lot of development in recent years, particularly with new entrants emerging from both the managed service provider and vendor ecosystems. For clients, the question is no longer “if” but “when”. However, because many businesses are still tied into legacy WAN contracts, not all enterprises can move to SD-WAN straight away. What we’re seeing is that these clients are exploring SD-WAN’s capabilities, although the early adopters are already deploying SD-WAN for full production. Thanks to well-developed relationships with over 300 local access partners and one of the largest global Tier 1 IP networks in the industry, we’re well placed to guide clients so they benefit from the full potential of SD-WAN technologies.
11. For example, SD-WAN has a major role to play in a multi-cloud environment?
As enterprises increasingly adopt a multi-cloud approach, they will need a flexible, agile network. SD-WAN isn’t a static network environment like a traditional WAN. Instead, it can make decisions that benefit the performance of a specific application. By routing traffic over the best available path, businesses gain the agility to optimise the network and prioritise mission-critical or latency-sensitive applications like video and voice.
12. More generally, smarter and faster networks are needed to respond to the demands of modern business?
Absolutely. The network is a key foundation and enabler of IT. Within the world of IT, there are three key areas:
1. Processing data in an application
2. Storing data
3. Moving data between a client and a server – or between applications
The movement of data is the area in which the network comes into play. Given the huge disruption and changes to the way we process and store data, we need more dynamic networking environments that adapt to the needs of the applications.
13. Specifically, what networking changes can we expect to see in terms of both topologies and speeds in order to meet the demands of the digital world?
Over the next few years, we are going to see the widespread roll-out of 5G, which will provide an additional local access technology alongside existing infrastructure options. Much more fibre will also be delivered to the premises, as wireless technologies still can’t compete. The capacity of fibre-based networks today is vastly better than just a few years ago. This isn’t because the physical cable itself has changed, but because innovation in optical transmission technologies has made it possible to move more data over a single fibre strand.
We’re likely to see more flexible bandwidth, adapting to demand. These changes will be facilitated by the client or even automated. For example, if an application demands an extra gig of bandwidth, the network can dynamically scale to facilitate this.
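A toy control loop illustrates the idea of bandwidth that scales with demand. The provisioning call below is a hypothetical stand-in, not any provider’s real API, and the thresholds and circuit names are invented.

```python
# Hypothetical bandwidth-on-demand control loop: scale a circuit up
# when utilisation is persistently high. The provision() call is a
# stand-in for whatever API a provider actually exposes.

def provision(circuit: str, mbps: int) -> None:
    print(f"[api] set {circuit} to {mbps} Mbps")   # placeholder action

def autoscale(circuit: str, current_mbps: int, utilisation: float,
              step_mbps: int = 1000, threshold: float = 0.8) -> int:
    """Return the new circuit size after one control-loop pass."""
    if utilisation > threshold:
        provision(circuit, current_mbps + step_mbps)
        return current_mbps + step_mbps
    return current_mbps

size = 2000
for u in (0.55, 0.86, 0.91):   # utilisation samples over time
    size = autoscale("lon-fra-wave1", size, u)
```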
14. 5G, IoT, AI and other intelligent automation technologies will cause a transformation of the networking infrastructure that supports them?
Yes, IoT, AI and edge computing are already starting to provoke a need to transform networking infrastructure. This is why we’re seeing the growth of SD-WAN and the use of data-driven, intelligent technologies.
Machine learning means that systems can learn and adapt based on the current performance of the network and the demands placed on it, and much of this type of automation technology is now being integrated into networks.
Not only this, but the migration to an applications-based workplace is also driving the need for more agile and programmable networks like SD-WAN.
15. More generally, does the dreaded B-word (Brexit) have any impact on end users and their networking requirements and/or capabilities?
Nobody knows exactly how Brexit will affect their business. What is certain is that organisations need robust networks for international operations – now more than ever. UK businesses need to ensure they maintain strong connections across the globe, and that is something GTT can guarantee.
16. Resilience has always been important, as has latency, but, moving forwards, do you see these becoming essentials, rather than nice to haves, for all businesses?
Resilience is the most essential component. In the past, enterprises used to host everything in a local data centre connected to branch offices with a simple network architecture. However, network requirements have become more complex.
The network is the foundation for enabling secure, scalable and efficient use of cloud and edge-based applications and devices. Today’s businesses need to ensure they have a resilient network, built to survive failures, and one that ensures traffic takes the most direct, lowest-latency path available to mission-critical applications.
17. End users face a wealth of connectivity choices. Can you outline these choices and offer some advice as to how end users can make some sense of how to put together a network that best meets their business’s needs?
When talking about end users who have responsibility for network architecture within an enterprise, we talk about performance, resiliency and cost. Enterprise users are focused on application performance. It’s no longer as simple as just being connected in the office. It’s about the people in that office being connected to each other and to the applications they need to run their business. And what risks are associated with those applications?
We advise our clients to ensure they have at least two access options in place for each location set up in such a way that either can handle the full traffic load for that location in case of disruption. Ultimately the access choices you make are driven by cost vs risk to the business.
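As a back-of-the-envelope illustration of that advice, the sketch below checks whether each access circuit at a site could carry the site’s full peak load on its own. The sites and figures are invented.

```python
# Quick resilience check: can each access link at a site carry the
# site's full peak load if the other link fails? Figures are invented.

SITES = {
    "london_hq":    {"peak_mbps": 800, "links_mbps": [1000, 1000]},
    "leeds_branch": {"peak_mbps": 450, "links_mbps": [500, 100]},
}

for site, cfg in SITES.items():
    weakest = min(cfg["links_mbps"])
    resilient = len(cfg["links_mbps"]) >= 2 and weakest >= cfg["peak_mbps"]
    print(f"{site}: resilient={resilient} "
          f"(peak {cfg['peak_mbps']} Mbps, weakest link {weakest} Mbps)")
```

Here the branch site fails the test: its 100 Mbps back-up link could not absorb a 450 Mbps peak load, which is exactly the cost-versus-risk trade-off described above.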
18. Are you able to share one or two customer success stories with us?
A great success story I’d like to draw upon here is our work with UEFA. GTT manages the ICT infrastructure that supports the UEFA.com website and the UEFA Football Administration Management Environment (FAME), in addition to the ICT infrastructure that is used to manage tournaments such as the UEFA Champions League. We provide an agile SLA model that enables SLA levels to be calibrated to the specific requirements of a competition or event. This dynamic SLA has helped UEFA balance the need for guaranteed 100% availability during live matches with more standard uptime requirements for back office systems during business as usual times.
To give a different example, we’ve also been working with Carglass – a leading vehicle glass repair and replacement company. We’ve provided it with high-speed connectivity for its 450 locations, which includes fibre links and back-up links for 250 repair centres plus 4G wireless for 180 repair centres. Our network solution supports Carglass’s supply chain and workflow, enabling technicians to access applications hosted in the cloud and work in real time from any location.
This issue of Digitalisation World includes a major focus on augmented and virtual reality technologies, with particular reference as to how they already are, and will be, impacting the enterprise world. The feature includes a couple of articles on how AR and VR are likely to find a role within the business world, along with a range of expert comments as to how and where AR and VR might just be making a difference in the workplace, both now and into the future. Part 1.
James Watson, Chief Marketing Officer at Immerse, explains how virtual reality is revolutionising corporate training
Long hailed as the “next big thing”, virtual reality (VR) gaming is gaining meaningful traction in the consumer market as equipment becomes less clunky and more affordable. Games like Doom VFR, Skyrim VR and Beat Saber are garnering rave reviews, and the release of the all-in-one headset Oculus Quest earlier this year propelled VR gaming into the mass market.
As VR technology becomes both more sophisticated and more accessible, its applications beyond the gaming space are also taking off. From virtual showrooms to product demonstrations, leadership teams are exploring ways of using VR to boost their businesses. In fact, companies that fail to explore the potential of VR risk getting left behind: recent research finds that nearly three-quarters of small to midsize businesses will be experimenting with immersive technology by 2022.
But although VR has exciting marketing and product uses, perhaps its most game-changing corporate application is in employee training and learning and development (L&D). VR can replace traditional training methods with a highly realistic simulated environment where trainees can learn and practise real skills without any danger — something that’s especially valuable in high-risk sectors like oil and gas and healthcare.
VR technology has the potential to completely revolutionise the way that workplace training is delivered, transforming L&D into something that’s far more effective and engaging than methods like PowerPoint presentations and multiple choice quizzes. VR has been shown to improve the absorption and retention of information and it generates a wealth of actionable data that is hugely valuable to trainers and businesses.
These are some of the benefits that VR can bring to both workers and workplaces.
Overcoming distance and access difficulties
Virtual reality can provide an excellent training solution when classroom-based training is unlikely to adequately equip workers with the skills they need, but practising in real life is simply not practical. This may be because access to equipment is limited, workers are based in different locations, training scenarios are too dangerous to replicate in real life (for hazard and disaster training, for example), or training in real life is unethical (training on a medical procedure, for example).
Oil and gas company Shell used a VR environment to train their employees on emergency response procedures. In an emergency situation, workers need to be able to follow the right process while under pressure. To recreate these situations in real life would be dangerous, expensive and impractical, but by creating a virtual reality simulation of an emergency, Shell could give their employees the chance to practise their response without exposing them to any danger.
The highly detailed and realistic VR environment tricks the brain into thinking that the situation is real, provoking genuine emotions like fear or vertigo. This means that trainees can practise working in a stressful situation while they’re actually in a safe, controlled space without physical risk.
In the medical space, GE Healthcare is trialling a VR programme that trains radiographers to perform CTCA scans — a special type of x-ray that can identify patients at risk of developing heart problems. Using VR for this training is helping to overcome obstacles such as limited access to equipment, geographical and scheduling issues, and the problem of practising on real patients.
Engagement, absorption and retention
VR training immerses users in realistic, simulated scenarios; the VR headset eliminates visual distractions, and trainees can interact with colleagues and manipulate virtual tools and objects. This makes training more engaging and effective: according to a University of Maryland study, 40% of participants scored at least 10% higher in recall ability when using VR, compared to a desktop display.
Gamification can further increase levels of engagement, encouraging healthy competition between trainees and prompting employees to return to the training environment to improve their score. A VR training module created for logistics company DHL, for example, challenged trainees to stack virtual packages against the clock and against each other while sticking to standard operating procedures. Employees from across the world came together to train in the same virtual environment, with a global leaderboard encouraging competition. Eight out of 10 employees were keen to repeat the training so that they could get higher scores and move up the leaderboard.
The engagement benefits reach beyond the training itself, too. Staff who find their company training enjoyable are more likely to be engaged with their jobs and with their employer. DHL recognised that by increasing employee engagement through the use of VR training, it could increase staff retention, bringing further cost and efficiency benefits to the company.
Truly measurable training
Perhaps the most revolutionary aspect of VR training, however, is the wealth of data that these platforms can capture, and the opportunities that this brings. A VR training platform can collect dozens of data points per user per second. Every action an employee takes in the VR setting — from an interaction with a colleague to a training checkpoint passed — can be fed into a dashboard and assessed by the employer; a simple sketch of such an event stream follows the list below. This data capture brings a multitude of benefits:
● Individual analysis
All the data is stored for playback, enabling trainers to review and analyse an employee’s training performance via the web or in the VR platform itself. They can identify where mistakes were made and help an employer understand when an individual may need further training or support.
● Group performance
Aggregate data from all employees who have been through a training programme can help employers to assess the performance of their workforce and how well processes are being understood and followed.
● Keeping consistent
The data provides a robust record of how, when and what employees have been trained on, eliminating the risk of important material being missed due to instructor error or concentration lapses, for example.
● Providing proof
Recordings can also be used as proof that training has been completed to a certain standard if businesses need to show that they’ve met the requirements of auditors or regulators.
● Continuous improvement
Employers can interpret and act on the data to continuously update and improve their training programmes.
● Proving the value
Data from the VR platform can help companies assess the value and effectiveness of their training, comparing performance over time and providing measurable outcomes and ROI.
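As promised above, here is a minimal sketch of such an event stream: each headset action becomes a timestamped record that a platform can aggregate for a dashboard. The event names and fields are invented for illustration, not taken from any particular product.

```python
# Minimal sketch of VR training telemetry: each headset action becomes
# a timestamped event, which the platform aggregates for a dashboard.
# Event names and fields are invented for illustration.

from collections import Counter
from datetime import datetime, timezone

def make_event(trainee: str, kind: str, detail: str) -> dict:
    return {
        "trainee": trainee,
        "kind": kind,            # e.g. "checkpoint", "tool_use", "mistake"
        "detail": detail,
        "ts": datetime.now(timezone.utc).isoformat(),
    }

events = [
    make_event("alice", "checkpoint", "donned_safety_gear"),
    make_event("alice", "mistake", "skipped_pressure_check"),
    make_event("bob", "checkpoint", "donned_safety_gear"),
    make_event("bob", "checkpoint", "completed_pressure_check"),
]

# Dashboard-style aggregate: mistakes per trainee.
mistakes = Counter(e["trainee"] for e in events if e["kind"] == "mistake")
for trainee in {e["trainee"] for e in events}:
    print(trainee, "mistakes:", mistakes.get(trainee, 0))
```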
VR: the new frontier of corporate training
VR’s impressive potential is finally being realised, in both the consumer gaming world and in the corporate training world. As the technology rapidly improves and business leaders embrace the benefits it brings, we’re likely to see a steep rise in the uptake of VR training.
With the help of VR, training can become engaging, effective, and data-driven. And as new features are added to VR platforms, the opportunities become even more exciting. Data capture, for example, could soon deepen even further to include things like eye-tracking, biometric data, and even brainwave monitoring.
VR technology encourages business to look at training in a different way, and equips workplaces with tools that can maximise performance. This results in workers who are more engaged and more capable, and ultimately in workplaces that are more productive.
We all know how powerful and productive data analytics can be for a business — but knowing is only half the battle.
By Hugh Simpson, Global Solutions Lead for Data & Analytics, AI, and Industry 4.0, Ciklum.
The big data and business analytics (BDA) market is predicted to hit $203 billion in the year 2020, up from $130.1 billion in 2016, according to research firm IDC. But sometimes you still need to convince colleagues, managers or decision-makers that data analytics is worth the effort and how it can be good for business.
Here are 10 ways to win over the sceptics and push them into action.
1. Focus on what you can do with the numbers
It’s all too easy to look at data analytics from a tech perspective, getting caught up in the statistics of how much data you can process, how quickly you can crunch the numbers or even how neatly you can produce data tables or charts at the end of the process. The problem is that colleagues may not share your excitement for such matters. Instead, you need to sell them on what they can do with that data and how it can serve as a tool to make their work easier rather than as an end in itself. If you’re looking for tips, take a leaf from your marketing department and remember that you are trying to sell data analytics as a benefit rather than a feature.
2. Emphasize the bottom line
When you’re dealing with a corporate hierarchy, you have to remember that almost anyone you speak to is looking to improve their own department’s performance along with the company’s profits. Give concrete examples of how data analytics could improve revenue or cut costs. For example, explain how analysis of the bounce rates and navigation paths of your website can show a point at which potential customers are abandoning ship. Alternatively, demonstrate how call center traffic analysis can offer insights on effective organization of staff shift patterns.
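The bounce-rate example is worth making concrete. A minimal funnel analysis, with invented page names and visit counts, might look like this:

```python
# Minimal funnel analysis: where do visitors abandon the checkout path?
# Page names and visit counts are invented for illustration.

funnel = [
    ("product_page", 10_000),
    ("basket",        3_200),
    ("delivery_info", 2_900),
    ("payment",       1_100),  # big drop: a candidate point of abandonment
    ("confirmation",  1_050),
]

for (step, n), (next_step, next_n) in zip(funnel, funnel[1:]):
    drop = 1 - next_n / n
    flag = "  <-- investigate" if drop > 0.5 else ""
    print(f"{step} -> {next_step}: {drop:.0%} drop-off{flag}")
```

A table like this, rather than the raw logs behind it, is the kind of concrete, revenue-shaped evidence that tends to win over a sceptical department head.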
3. Incorporate the unknowns
Donald Rumsfeld once discussed “unknown unknowns: the things we do not know we do not know.” Make a point of stressing how data analytics can uncover patterns, trends or counterintuitive relationships that would otherwise get lost among the numbers. Almost any decision maker in an organization will love the idea of getting secret information, and data analytics is a legal and ethical way to make this happen.
4. Network thoughtfully
Though businesses have come a long way in recent decades, there’s still a corporate ladder in many organizations — and at every rung, somebody’s trying to justify their position. When you’re trying to get a decision maker on board with your drive for data analytics, consider their viewpoint. Could the patterns your analysis uncovers help them make smarter decisions, show off measurable “wins” or simply offer concrete results for which they can take the credit? If so, they might be more amenable to giving the thumbs up.
5. Take a storytelling approach
Everyone responds to different forms of communication, but human nature compels most people to pay attention to stories. Look for case studies from your existing work (or even your competitors’ work) that demonstrate how data analytics made a difference. Some audiences might find cold hard numbers more persuasive, but others will respond to personal accounts with identifiable characters for whom data analytics solved a problem or brought a breakthrough.
6. Turn a negative into a positive
Sometimes you’ll need to take a counterintuitive approach to making your case. For example, take a department head or executive officer who has no time for social media or anything it stands for. You can start out by sympathizing with their view and conceding that the way the company has been using Facebook or Twitter has not brought any obvious benefits so far. Then show how data analytics could be the key to unlocking its real potential. For example, what if you could show that mentions of your food brand peaked at 3 p.m. and you used that information to test a marketing campaign based on “beating the afternoon munchies monster” rather than delivering your usual message of nutritional value?
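The underlying analysis in that example can be as simple as bucketing brand mentions by hour of day; the timestamps below are invented for illustration.

```python
# Bucket social-media brand mentions by hour of day to find the peak.
# Timestamps are invented for illustration.

from collections import Counter
from datetime import datetime

mentions = [
    "2019-08-01T09:12:00", "2019-08-01T15:03:00", "2019-08-01T15:41:00",
    "2019-08-02T15:18:00", "2019-08-02T15:55:00", "2019-08-02T11:30:00",
]

by_hour = Counter(datetime.fromisoformat(ts).hour for ts in mentions)
peak_hour, count = by_hour.most_common(1)[0]
print(f"Mentions peak at {peak_hour}:00 ({count} of {len(mentions)})")
```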
7. Simplify the subject matter
Mention “data analytics” to many people in your business and their eyes may glaze over — not so much from boredom, but because of a fear that the whole subject is impenetrably complicated. Be sure to convey that your systems are straightforward to use, that they don’t require specialist tech knowledge to query, and that the set-up is designed to be intuitive and responsive. Of course, this only works if it’s true, so put the groundwork into the system first.
8. Know your audience — and their challenges
Before pitching data analytics as a concept, take the time to work out how it can be a solution. For this, you’ll need to ask the different parts of your organization what their problems are. Don’t be afraid to couch this as unconstrained “blue-sky thinking”: That means not only do you ask questions such as “What’s the biggest obstacle to success in your work?”, but you also encourage people to think more widely through questions such as “If you could learn one useful fact about your product/customers/processes that’s currently a mystery, what would it be?” Knowing people’s pain points makes it far easier to explain why data analytics can help.
9. Don’t be afraid to get controversial
This isn’t an approach suitable for every target, so you’ll need to judge your audience. But if you’re dealing with a manager or exec who’s particularly competitive, there’s no shame in taking advantage of this fact. Search online for case studies involving data analytics by rival firms in your industry and show how they might be getting one up on you.
10. Look at the big picture
It might seem intuitive to try to talk big when making the case for adopting data analytics — stretching the bounds of credibility with the possible gains. But not every advantage you extract from the data will be large, so the key is to emphasize that they all add up. Consider the example of the British track cycling team that adopted a tactic of “marginal gains”: They identified every possible performance variable in their crew and equipment and tried to boost it by just 1 percent. These tiny improvements that may have seemed trivial made a huge difference in aggregate: At the 2012 Olympics, they won seven out of a possible 10 gold medals, setting nine world records in the process.
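The arithmetic behind marginal gains is worth spelling out: small multiplicative improvements compound, so many 1% gains add up to far more than their simple sum.

```python
# Compounding "marginal gains": n independent 1% improvements multiply.
for n in (5, 10, 50):
    print(f"{n} x 1% gains -> {1.01**n - 1:.1%} overall improvement")
```

Fifty separate 1% improvements, for example, compound to roughly a 64% overall gain rather than 50%.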
So, it’s clear that data is now a key business asset, and it’s revolutionising the way companies operate across most sectors and industries. In effect, every business, regardless of size, now needs to take advantage of increasingly accessible analytics – or risk being left behind.
Charlie Brooker’s recent “Rachel, Jack and Ashley Too” episode in the latest season of Black Mirror poses a lot of questions about the future of artificial intelligence (AI).
By Ben Lorica, Chief Data Scientist at O’Reilly Media.
The episode’s “Ashley Too” toy is injected with the personality of a pop star, played by Miley Cyrus, and is designed to make fans feel closer to her. We witness a lonely young teenager befriend the doll and confide in it as a real-life best friend. With moving eyes, a dancing body and emotive conversations, the toy leaves us questioning how much personality our AI smart devices should have – or whether they should have any at all.
As robots become more human in their interactions, humans are of course more likely to form a connection with them. However, this episode is merely Charlie Brooker’s prediction as to what the future holds. It’s not yet a reality. Whilst AI smart devices such as the Amazon Alexa and Google Home assistant are entering our homes, they’re not yet necessarily equipped with the ability to detect emotion or respond to questions in an emotive manner. So, does this mean that AI will be limited in its success by its lack of emotional intelligence (EI)? Or is EI on the cards for AI to be even more successful?
Emotional intelligence and affect recognition
If one thing is for sure, it is that businesses are reaping the benefits of AI’s ability to free us from the more repetitive tasks in the workplace. AI is changing the nature of work. It’s helping to remove the mundane, enabling us to make more informed decisions with its analytical capabilities and its ability to wade through large amounts of data through machine learning.
Yet, according to a report from Gartner, EI accounts for more than 90% of a person’s performance and success in a technical and leadership role. With this in mind, it would be unlikely for AI to completely replace human beings in the workplace at this stage, given its lack of emotional intelligence (among other things). Emotional intelligence, deep domain expertise and a set of “soft skills” cannot yet be automated by current AI technologies.
Whilst AI is not yet equipped with emotional intelligence, there have been a lot of reports and studies around affect recognition. According to the AI Now Institute, affect recognition is a subclass of facial recognition: a technology said to be capable of detecting our personality, emotional state, mental health and “worker engagement” from images or videos of our faces. The problem, however, is that this is just a claim. There is currently no scientific evidence to back up statements that AI is capable of doing this.
Let’s take recruitment as an example. Often, we hear of companies utilising AI technologies to help with the recruitment process. AI biases around race and gender aside, there are major concerns here. Without substantial evidence to confirm that the technology and affect recognition can actually detect the right skills, such as emotional intelligence, it’s not a technology we can confidently put into play.
AI+EI: What’s next?
Whilst emotional intelligence is seen as a critical factor in workplace success, it is not an easy skill to teach. As human beings, we are presented each day with a variety of decisions, many of which rely on our emotional competence to relate to individual scenarios. The next step in the development of this technology sits with those who build and design it. This will be an interdisciplinary endeavor: technologists will need to collaborate with colleagues from other disciplines (the social sciences, humanities and so on), and engineers building AI need to be representative of everyone. Cross-functional teams are required to ensure its design reflects all types of people, which means working with male and female engineers of different ages, races and backgrounds. Security, privacy, ethics, and compliance issues will also increasingly require companies to set up cross-functional teams when they build AI and machine learning systems and products. To be truly emotive, AI needs to understand different types of people.
AI and machine learning are appearing in many of the products and systems we interact with, and you could already call them a success for their ability to free us from more mundane and repetitive tasks. However, at present, AI lacks the ability to understand human emotions. Current technologies require large amounts of data and compute, transparency and explainability can sometimes be insufficient, and causal reasoning remains a missing component. Given the massive investments in R&D, expect continued progress in AI technologies in the year ahead. But it’s not something that’s going to happen overnight. Brooker may have predicted the future – but it’s further away than we realise.
This issue of Digitalisation World includes a major focus on augmented and virtual reality technologies, with particular reference as to how they already are, and will be, impacting the enterprise world. The feature includes a couple of articles on how AR and VR are likely to find a role within the business world, along with a range of expert comments as to how and where AR and VR might just be making a difference in the workplace, both now and into the future. Part 2.
Ross Murphy is the CEO of digital publishing company, PageSuite, and custom app developer, SixPorts. He draws on his strong background in emerging technology to explore how augmented and virtual reality can benefit businesses across a range of sectors.
A recent report by Statista predicted that the combined market size of augmented reality (AR) and virtual reality (VR) is expected to reach $215 billion by 2021. While the emerging technology industry grows at a rapid rate, there are still people outside of the technology and digital sectors that are alienated by it. This poses a significant problem for business owners as many remain unaware of how it could benefit their business.
With the likes of Pokemon Go and Beat Saber particularly popular, AR and VR technology has traditionally been most prominent in the world of gaming. However, as the technology progresses and becomes more accessible, it is being used in increasingly innovative ways to stand out from the crowd and to solve problems faced by businesses of all sizes.
Both augmented and virtual reality offer new possibilities in the way consumers engage with products and services. For instance, AR is being used by retail businesses to distinguish their brand and transform the customer experience. Consumers are increasingly short of time, so they are prioritising convenience – something that could pose a problem for shops. However, AR technology has provided a solution. In 2017, Swedish furniture company IKEA launched its Place app, which uses augmented reality to enable customers to see how products will look in their homes before they buy them. The app helps potential IKEA customers to find furniture that fits in with the style and size of their home without even leaving the house.
Virtual dressing rooms are also becoming popular. In June, online fashion retailer ASOS announced the launch of the Virtual Catwalk on its app, which mutually benefits the business and the consumer. Through AR, it responds to the problem faced by e-commerce retailers of a large number of returns and helps people get a better idea of what the clothes will look like thanks to a 3D, 360° view of a virtual model.
With the rise of smarter cities across the world, augmented reality is playing a big part in making the way we live our lives easier and more efficient. There is scope for search engines to use AR to scan a high street and highlight particular businesses. Whether this is finding the nearest coffee shop or dry cleaner, augmented reality allows search engines to interact with the real world. The technology can also be a boost for businesses, as it can enable users to book a table at a particular restaurant or identify the services a particular business has to offer.
In a similar way to AR aiding our retail experiences, it can also enhance museum and art exhibits. The most significant way in which it can do this is by using augmented reality to give more detailed descriptions of the exhibits. This can be a great way of engaging young people and bringing something new to existing collections.
Virtual reality is becoming increasingly popular among architects to bring their designs to life. VR headsets can allow people to show their designs to clients in a more immersive way. BBC’s recent property makeover programme, Your Home Made Perfect, challenged two prominent architects to transform homes that were no longer fit for purpose. The final designs were then showcased to the client using VR headsets. Presenting designs in this way is truly unique and can really help architects stand out from their competitors. While the initial technology is a considerable investment, it can be hugely beneficial by helping to identify issues before a project reaches the construction stage and in doing so saving money in the long term.
As VR becomes more accessible, it is increasingly being used by companies as part of their recruitment process. For large companies that are looking to employ people from elsewhere in the world it gives them a more life-like experience than a video call, as it can give the employer a better indication of eye contact, body language and so on. Virtual reality can also be used to save organisations money on training. The US Army, for example, has launched a VR training programme that can help simulate battlefield conditions and different terrains without having to leave the army base.
The growth of AR and VR technologies is undeniable and, due to improvements in smartphone technology, people increasingly have access to AR-compatible devices. However, as the technology is more complex, virtual reality remains expensive, so as it stands it tends to be larger businesses that benefit from it. There is also the issue of infrastructure, as high-resolution VR headsets require a lot of bandwidth and currently 4G cannot cope. Luckily, with 5G gradually being introduced across the world, this issue is being resolved. Ultimately, as the technology progresses and 5G is more widely implemented, virtual reality will become increasingly accessible to businesses of all sizes.
The possibilities for how AR and VR can help your business appear endless, whether you’re a designer wishing to bring your digital drawings into the ‘real’ world or a manufacturer wanting to virtually test out a new product. By embracing this emerging technology, you will be able to stand out from your competitors and improve the service you offer.
SixPorts are specialists in creating innovative augmented reality solutions and developing virtual reality experiences for forward-thinking companies.
Could data centres offer a solution that saves sending millions of used electric vehicle batteries to the scrapheap?
By Leo Craig, General Manager of Riello UPS.
While the much-anticipated electric car revolution hasn’t quite hit the fast lane yet, there’s been an evident change of gear in recent years.
Sales of EVs in the UK during July 2019 were almost treble the amount recorded in the same month last year. That’s a 158% increase at a time when the overall car sales market remains relatively stagnant.
The total number of electric vehicles in Britain is fast-approaching 250,000, up from 3,500 back in 2013, solid progress towards the Government’s target of halving non-electric vehicle sales by 2030 and phasing them out completely by 2040.
This trend is set to continue in the years to come, both here and on a global scale, though the projected figures vary depending on whose statistics you believe.
This continued growth poses several obvious questions, such as “how quickly will all the necessary infrastructure and charging points be installed?” and “how will the grid cope with demand if everyone suddenly charges their cars at the same time?”.
And then there are all the batteries… Lots and lots of batteries! More lithium-ion batteries are now used in EVs than in consumer electronics. The battery packs in cars need replacing roughly every 10 years, while for buses and taxis that figure is every four years.
As the number of EVs on the roads grows at such a rapid rate, so too will the number of old batteries.
While the recycling rate for lead-acid blocks often found backing up a data centre has reached 96-98%, the picture isn’t as rosy for Li-ion. Recycled lithium can cost up to five times as much as new, virgin material, making recycling such an unattractive option that the rate across the EU is currently below 5%.
Second Life As Stationary Storage
The good news is there’s another option to prevent the millions of old – and potentially toxic – batteries set to flood the market from being sent to the scrap heap.
While they might not be capable of powering a car or bus anymore, lithium-ion batteries can still store and discharge energy for another 7-10 years once they’re off the road. This is because they still retain up to 70% of their original power capacity.
This enables them to have a “second life” where they’re utilised for energy storage. For large-scale electricity users such as data centres, factories, or utilities, this opens up huge commercial opportunities.
Bloomberg NEF analysis suggests repurposing old EV cells costs roughly $49 per kWh. That’s a significant reduction on the $300 per kWh price of a new stationary storage battery.
So in theory, data centre operators could install a vast bank of reused EV blocks to provide emergency back-up at a fraction of the cost of a new system.
They could use the power stored in these battery stacks as a cheaper alternative to peak mains electricity, cutting energy bills. In addition, it provides a route to take part in demand side response mechanisms that help balance the nation’s electricity networks (and offer operators extra sources of revenue).
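Using the Bloomberg NEF figures quoted above, the potential saving is easy to estimate. The capacity requirement and the 70% derating for second-life packs in this sketch are illustrative assumptions.

```python
# Rough cost comparison for a back-up battery bank, using the per-kWh
# figures quoted above. The capacity requirement and the 70% derating
# for used EV packs are illustrative assumptions.

USABLE_KWH_NEEDED = 1_000          # hypothetical back-up requirement
NEW_COST_PER_KWH = 300             # new stationary storage ($/kWh)
USED_COST_PER_KWH = 49             # repurposed EV cells ($/kWh)
USED_CAPACITY_FACTOR = 0.70        # second-life packs retain ~70%

new_cost = USABLE_KWH_NEEDED * NEW_COST_PER_KWH
# Buy more nominal second-life capacity to hit the same usable figure.
used_cost = (USABLE_KWH_NEEDED / USED_CAPACITY_FACTOR) * USED_COST_PER_KWH

print(f"new: ${new_cost:,.0f}  second-life: ${used_cost:,.0f} "
      f"({used_cost / new_cost:.0%} of the cost of new)")
```

Even after over-buying to compensate for degraded capacity, the second-life bank in this sketch comes in at under a quarter of the cost of new storage.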
Challenges To Overcome
That’s not to say there aren’t any hurdles to overcome. Electric vehicle batteries vary significantly. Even though they’re made using similar technologies, depending on the manufacturer, there’ll be distinct differences in size, shape, and performance.
They’re used in different cars, in different climates, under different stresses. As a consequence, this means they’ll age differently too, which makes it difficult to combine them into a single coherent storage system. Used EV batteries need careful testing using expensive monitoring software to group cells with similar performance characteristics together.
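A toy version of that grouping step might simply bin packs by measured state of health; the test data and bin edges below are invented for illustration.

```python
# Toy version of grouping used EV packs by measured health so that
# similar cells can be combined into one storage string. Test data
# and bin edges are invented for illustration.

packs = {
    "pack_a": 0.72, "pack_b": 0.68, "pack_c": 0.55,
    "pack_d": 0.71, "pack_e": 0.58,
}  # fraction of original capacity remaining, from bench tests

def health_bin(soh: float) -> str:
    if soh >= 0.65:
        return "grid-storage grade"
    if soh >= 0.50:
        return "low-duty storage grade"
    return "recycle"

groups = {}
for pack, soh in packs.items():
    groups.setdefault(health_bin(soh), []).append(pack)

for grade, members in groups.items():
    print(grade, "->", members)
```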
Such variability will raise concerns for mission-critical businesses in our industry. If there’s any uncertainty about the performance of these used batteries, isn’t it better to be safe rather than sorry and buy new, purpose-built grid storage batteries instead?
There’s also an argument that repurposing EV batteries for energy storage isn’t as effective as recycling in the long-term. Car batteries need to deliver lots of energy in a small footprint. Large-scale energy storage applications in data centres or factories don’t have the same restrictions on size or weight.
With materials such as cobalt and lithium increasingly scarce, there’s a case that recycling these precious metals to make new batteries makes more economic sense. Reusing them simply kicks the can down the road for a little while longer.
This is the current view of the world’s biggest electric car manufacturer, Tesla, which focuses on recovering the raw materials rather than reusing the cells.
One of the firm’s founders and its former Chief Technology Officer, J.B. Straubel, explains: “We expect 10 maybe 15-year life at a minimum from the Tesla battery. And, the degradation is not entirely linear.
“By the end of their life, the efficiency has degraded on every cycle, you see lower efficiency, the capacity will have somewhat degraded, and for a lot of reasons, it makes it very difficult to deploy those efficiently back into a grid setting, where you want high reliability and you want high predictability.”
Of course, a cynic could point to Tesla’s already extensive range of grid storage batteries and conclude it’s not worth their while to focus too much of their attention on the second life of EV cells.
First Sight Of Second Life
Yet while Tesla goes down the recycling route, several other major car manufacturers are already reusing EV batteries in large-scale energy storage projects.
Last year, Nissan launched the largest power storage project in Europe using both new and used batteries. It installed 63 second-hand and 85 new EV battery packs, which feed off more than 4,000 solar panels attached to the roof of the Johan Cruyff Arena in Amsterdam, home of the football club Ajax.
The system is capable of powering the stadium for up to an hour – roughly the hourly electricity consumption of 7,000 homes. But its primary function is to back up the venue’s supplies and reduce the strain on the grid at peak times.
Renault is another car maker tapping into EV battery second life with an ambitious project across three sites in France and Germany.
Due for completion next year, the storage system will harness the energy of 2,000 batteries – at least 60 MWh – to power around 5,000 homes.
One advantage the French manufacturer has is that while it sells customers the electric car, it only leases the batteries. This means it retains ownership of the packs and can take advantage of second life opportunities by offering drivers upgrades to newer, higher-capacity batteries over the car’s lifespan.
Looking To The Future
Reusing batteries for secondary storage increases their lifetime value while reducing overall cost.
In years to come, large-scale energy storage will play a pivotal part in harnessing the power of renewables and balancing the UK’s electricity network.
We only need to look at the recent power cut that affected many parts of England and Wales – when two large generators disconnected from the grid, battery storage assets picked up some of the slack, helping to resolve the blackout relatively quickly.
That’s before we even consider the impact that the concept of “vehicle to grid” could have on the wider electricity network. It works in a similar way to how data centres feed surplus power back into the grid via demand side response.
EVs are parked up and unused for more than 90% of the time. Instead of that stored energy sitting there doing nothing, it can flow back into the network at times of peak demand.
We shouldn’t lose sight of the fact that lithium is a finite resource too. There’s only 350 years’ worth left at current mining rates. The ultimate goal everyone should aim for is a full lifecycle: a first life in EVs, a second life repurposed for energy storage, and finally recycling.
Closing the loop on lithium-ion battery blocks doesn’t simply come down to economics or environmental concerns, it’s a moral issue too. According to investigations by Amnesty International, much of the cobalt used to stabilise Li-ion cells is mined in dirty and dangerous conditions, often by children aged as young as seven.
The data centre industry doesn’t perhaps enjoy the best of reputations, despite our efforts to improve energy efficiency. As electric cars become the norm rather than the exception, perhaps we should take the opportunity to show leadership on the issue of second life EV batteries.
Not only will this help tackle a ticking environmental timebomb, it will go some way to securing our electricity supplies, while also offering operators commercial gains in a hugely competitive marketplace.
In recent years, the world of Business Intelligence (BI) has been turned upside down. Data became big, organisations adopted cloud computing, and spreadsheets took a backseat to actionable data visualisations and interactive dashboards. Self-service analytics grabbed the reins and democratised the world of data reporting products. Suddenly, advanced analytics wasn’t just for the analysts.
By Naveen Miglani, CEO and Co-Founder at SplashBI.
In 1958, a computer scientist, Hans Peter Luhn, published an article titled “A Business Intelligence System” in the IBM Journal of Research and Development that would later become the foundation for how BI is understood today. Luhn’s article suggested using technology to simplify the process of gathering data rather than sifting through mountains of information by hand. Today, we understand BI in just those terms: using technology to compile and analyse data, translate it into useful information, and then make strategic decisions based on the results.
The recurring trend in next-generation BI tools is simplicity. Complex data analysis has become a breeze with the introduction of self-service analytics platforms. Advances in BI technology alleviate the stress and labour hours of gathering, sorting, and using data to make informed business decisions. But how have these changes affected businesses in the last few years – and what’s to come?
Self-service analytics
Self-service analytics has consistently topped the list of BI trend predictions each year, showing the increasing accessibility of BI tools and the positive impact of putting data back in the hands of individual teams, departments and leaders within organisations. The rising adoption of self-service analytics enables users to gain deeper insights to drive data-focused initiatives across the entire organisation—without having to rely on IT.
The rise of self-service analytics has also brought more attention to the growing necessity for modern organisations to adopt a data-driven culture. Businesses all over the world are using elegant visualisations and dashboards to tell their data story, and they’re doing it without using up a massive amount of IT resources. As advances are made in BI technology, the process of implementing a BI tool has become much less of a daunting task. Implementation and adoption time have been almost cut in half, data integration tools stepped into the ring, and talk of data governance/security solutions became common watercooler conversation.
Integrating technology
2017 was a major year for the BI industry. Significant advances were made in the way new technology integrated with existing BI processes, along with the development of tools that allowed data from separate applications or data stores to unite and display the big picture. The cloud was widely adopted due to advanced security and accessibility. Machine learning increased revenue for businesses by tracking buyer behaviour and analysing databases faster than ever before. AI became more prominent, and trials began to determine if AI could eventually replace human data scientists altogether.
By 2018, data analytics had become a routine part of daily duties for most organisations. The value of using a BI tool was a given, but the question moved to choosing the right tool to fit an organisation’s unique and specific needs. Leaders began to look at common pain points in the business and to learn how they could get the most value from a BI tool by asking questions such as: What do we want to achieve from analysing our data? How can BI help us reach our business goals? How can we use data to improve employee retention, or measure turnover? Can we see which product drove the highest volume of sales in Q1? Could these insights really help us locate and obtain net new clients?
BI has never been a one-size-fits-all answer. That’s the reason it initially gained popularity: different departments have different data. Sales won’t need the same Monthly Advertising Report that Marketing will use to create next month’s budget. BI was the hottest new tool that could help any person, in any position, in any company use their data to make fact-based decisions. These custom data reports guided businesses towards the metrics that matter most, whether in HR, Marketing, Sales or Finance.
BI now and in the future
BI and data analytics technology is constantly evolving and the market shows no signs of slowing down. Business Intelligence makes data of any kind easy to digest with stunning visualisations, detailed historical analysis, and customisable reports. In fact, the global BI and analytics market is expected to reach $20 billion by the end of 2019.
In 2020, experts say we will continue to see increased adoption of BI tools among businesses of all sizes that hope to speed up their organisation’s journey to success. Retail, construction, healthcare, banking and transportation are expected to make up the majority of new adopters. Additionally, the way data is created and handled will experience significant change in the coming years.
But what does the far future look like for BI? What was once just a tool for pinpointing patterns in an organisation’s data has evolved into a robust, real-time solution focused on using hard and fast data not only to see a snapshot in time, but to view the entire picture. BI enables companies to make the best possible decisions using their own data, and it is the organisations that capitalise on this technology that will reach their business goals.
This issue of Digitalisation World includes a major focus on augmented and virtual reality technologies, with particular reference as to how they already are, and will be, impacting the enterprise world. The feature includes a couple of articles on how AR and VR are likely to find a role within the business world, along with a range of expert comments as to how and where AR and VR might just be making a difference in the workplace, both now and into the future. Part 3.
Andy Barr, CEO and founder of www.10yetis.co.uk comments:
“Virtual Reality (VR) gives brands a distinctive way of letting consumers see them in new and exciting ways. For example, using a VR experience at a trade stand or consumer exhibition is always impressive and offers an excellent engagement and talking point between the brand and the person using the VR headset. When trying to draw in new customers, it is important to wow punters, and VR is a great way of doing so: it provides something new and exciting, and helps them picture what a brand or company can realistically offer them.
In terms of internal comms, VR can be a great asset for communicating change: more often than not, big companies can drive big changes to programmes and departments through VR, which helps to get more people invested internally.
VR is one of the best ways of communicating difficult messaging, because when it is in use it requires total commitment and concentration. If we look at the construction sector, for example, it may be hard to envisage a final build, but with VR people can see what a building could look like once it is developed. This would be particularly useful during a public consultation or when trying to win the rights to go ahead with a construction project.
VR is still a relatively new and unexplored technology, and with more development will come more accessibility. We are likely to be in a position in the coming years where VR is used widely across a range of sectors due to how versatile it will become. We have recently begun providing it as a service here at 10 Yetis, and it’s proved very popular with our clients. (You can read more about it here https://www.10yetis.co.uk/work/case-study/virtual-reality-video-experience).”
The case study can be read below as well:
Background
Dowdeswell Estates is a Cotswolds-based provider of unique and prestigious construction projects, ranging from luxury residential to high-quality hospitality. The company is renowned for creating and developing stunning buildings and interiors, set amongst some of Britain’s most outstanding areas of natural beauty.
The company approached our creative team at 10 Yetis Digital following the success of our launch campaign for Julian Dunkerton’s local brewery and brand – Dunkerton’s Cider – and wanted to engage us for an exciting upcoming event aimed at developing key stakeholder relationships and celebrating the brand’s amazing success so far.
Brief & Objectives
The bespoke, luxury design and construction company had engaged in a sponsorship deal with the Gloucester Tall Ships Festival and sought to create a groundbreaking, exciting and engaging experience to take their private client event on one of the incredible tall ships to the next level.
As a team, we saw this as the perfect opportunity to flex our creative muscles and build on our previous video service and experience with the budding new medium of 360 video and virtual reality (VR).
Strategy & Tactics
When generating ideas for this project, we quickly decided that whatever we chose to put forward, Dowdeswell’s stunning portfolio of Grade-listed buildings had to be at the centre of the idea. This led us to consider a variety of creative tactics and mediums, with both VR and AR considered as really strong options for the project. Both mediums would offer an interactive and memorable experience that would enable us to showcase Dowdeswell’s incredible portfolio of local developments.
Dowdeswell Estates settled on VR over Augmented Reality (AR), as it would allow the brand to showcase and describe the attention to detail that they adopt when developing any of their properties a lot more efficiently than with AR, as well as using the available space on the boat more efficiently. We put forward the idea of allowing guests of the party to use virtual reality 360 video to look around the inside of the properties, while animated pop-ups appeared to explain the high quality, fine details of each room (think Jarvis from Iron Man. Really!).
Once we had sign-off on the concept, we began to plan the technical aspects of the shoot. Due to the tight turnaround of the project (8 days from concept to final assets), we had to be as efficient as possible. For this reason we chose to use a GoPro Fusion over a larger, more cumbersome 360 camera: it has an incredibly light form factor and a great companion app for capturing content while staying out of shot, and it still outputs in 5.2K, the lowest resolution we were willing to go to avoid sacrificing quality.
Working in collaboration with Chris Gage (Marketing & Communications Manager, Dowdeswell Estates) we identified three properties that the company felt passionate about and wanted to showcase. These were – Dowdeswell Estates HQ, Dunkerton’s Cider Shop and The House at No. 131, the latest addition to The Lucky Onion group. We then developed the shot list based on the features of each property that Dowdeswell wanted to highlight and offer further information on.
The capturing of the content was relatively straightforward; shooting both 360 video and images of the property that we could utilise for the final editing of the virtual reality experience. This allowed us to mix the two formats together, using the video for the parts of the experience that contained motion and the photography for high quality stills. Following this approach, we were able to ensure the quality of the finished product remained as high as possible, while keeping export times to a minimum due to the quick turnaround required.
To create the 360 element of the experience and stitch the video and photography together into a seamless transition, we used a mixture of GoPro Fusion Studio and the Adobe suite. Adobe After Effects was then used to create the pop-up details, before positioning them in 3D space and aligning the proportions of the box so they wouldn’t warp as the viewer looked around the property. This was one of the harder aspects, as it takes a lot of trial and error to minimise said warping. Thankfully, there is now a whole plethora of plug-ins available to help with this, as VR and 360 content has really started to make an impact on the industry. To bring the whole piece together, we added atmosphere and music to the scene, to really instil the full experience.
Having previously worked on 360 projects that solely relied on the use of panoramic tripod heads, it was refreshing to see how far the technology has advanced, to the point where we could now quickly deliver high quality 360 content to a high standard, in a very cost efficient manner. It is a really exciting sector to be experimenting with, as technological improvements allow us to try more and more creative concepts.
Here’s a peek at one of the experiences we created…
Results
The reaction was really positive and there was a lot of excitement around the experience. We created a dedicated area on the ship for the experience that included a ‘message in a bottle’ to keep in line with the theme. Chris Gage hand-wrote this personally for the event.
We caught a lot of the guests completely off guard, as the last thing anyone expects to find on a ship is a virtual reality experience, but it was very well received by both guests and the local media that attended. Most of the guests got involved in the action and were really impressed, not only with the experience itself, but also the amazing projects that Dowdeswell has been involved in.
The company reported extremely positive feedback from key stakeholders that attended the event and received queries seeking their services the day after…which was a Bank Holiday!
Testimonial
“At Dowdeswell Estates, we pride ourselves on working with local trades and craftsmen and we also carry that philosophy into our Marketing and Creative projects.
We knew of 10 Yetis from another event we had been a part of and were impressed with how they had handled a particularly demanding client and delivered the event under pressure with an ever changing brief, so when it came to us hosting an event at the Gloucester Tall Ships festival, it was an easy choice as to who to work with.
The initial meeting was informal, relaxed and very much a collaborative experience, with myself and the 10 Yetis team of Rob and Jamal, led by Kalli, throwing around numerous exciting ideas and possibilities on not only how to best showcase our product, but also how to ensure our guests had a unique and highly memorable experience. We ended up going for the Virtual Reality goggle experience, which was filmed within some of the Grade II listed properties we’ve restored, and whilst it was a complex and very new project to the team, their confidence in achieving it left me in no doubt as to their ability to deliver, and immediately got me excited for the event and the reaction of our guests to the VR experience.
During filming and editing, 10 Yetis took care of everything and as requested, I was updated throughout the whole process at regular intervals via WhatsApp with the latest info and drafts. Due to this collaborative and open communication, as a client, I had complete confidence in the team’s ability to deliver the assets to the high standard and quality I demanded. Come the event itself, the team couldn’t have been more supportive, taking care of guest lists, acting as tech support and also capturing social media assets throughout the day.
In short, the event itself was a complete success. The feedback we’ve been getting from our guests has been incredible and the VR experience had the desired effect with requests for meetings concerning discussions over new projects coming in the very next day despite it being Bank Holiday Monday. With the 10 Yetis team, you can use their services for one off events or projects where you will be guaranteed a great client experience with great value for money. However, for us, we see this as the start of a very collaborative and creative relationship and cannot wait to work with them again.”
Chris Gage, Marketing & Communications Manager.
Technology is transforming our lives. It's revolutionising the way we buy groceries, send gifts, listen to our favourite music, communicate with loved ones, and much more. Because it's so convenient and ubiquitous, we expect ultra-fast and highly personalised services from all the companies we engage with.
By Tiffany Carpenter, Head of Customer Intelligence at SAS UK & Ireland.
As a result, businesses are scrambling to launch projects that will enhance their customer journeys - often with little strategic direction and with mixed results.
For example, in its recent Darkness of Digital Shadows research paper, SAS found that 93 per cent of organisations don’t have the analytics capabilities required to predict customer behaviour accurately. Even more concerning: Less than a fifth of companies are prioritising the customer experience over their internal sales targets.
Shaking off outdated operating models
Most companies organise their businesses around specific products, processes and channels. This inside-out approach can create tedious and frustrating customer interactions.
For example, a customer is placed on hold for an extended period of time in your call centre. When she finally gets through, she’s told she needs to speak to a different department. Such disjointed customer service can cost your company dearly. And it’s a direct result of an organisation that is built around internal fiefdoms rather than customer needs.
By contrast, organising your operating model around your customers can yield great rewards. It inspires customers to remain loyal and recommend your services to friends and family. The results are lower churn, higher revenues and greater market share.
But how? Experts agree that there are three key ingredients for a customer-centric company: timely data, sophisticated analytics and a strong decision-making culture. If these elements are out of balance, projects focused on improving your customer experience will quickly derail.
Best-laid plans
Many companies recognise the power of a customer-centric operational model. With good intentions, they launch projects to mobilise data and improve customer experience across multiple touch points.
However, siloed customer experience improvement initiatives rarely generate good outcomes. While one specific department may meet all its core KPIs, overall metrics may show little sign of improvement. This is because customers view their interaction with organisations holistically. It doesn’t matter if your sales experience is excellent if your billing department keeps making mistakes. Customers won’t think twice about looking elsewhere.
Another common problem is that siloed customer experience improvement projects tend to be top-down, one-hit failures. They involve a lengthy development and testing process, with no provision for continuous iteration and enhancement. By the time the solution is ready to go live, the data is outdated. So you're right back to square one after wasting a huge amount of time and resources.
Similarly, companies often focus on using a centralised data lake instead of making data accessible to the business. By the time the data has been discovered, extracted and analysed, the conclusions are outdated. Once again, these projects occupy a huge amount of time and resources for little reward.
Seeing the world through your customers’ eyes
Let’s define an intelligent enterprise as a company that customers not only tolerate, but actually enjoy interacting with. These companies take a holistic approach to customer experience instead of dividing efforts across specific products, processes or channels.
Intelligent enterprises closely align data and customer intelligence platforms with their systems of action. This means sales agents have the right information at hand while serving customers, and personalised offers are delivered via mobile apps or websites.
For example, Telefónica recently launched a project to reduce customer attrition. It used SAS Intelligent Decisioning solutions to analyse large amounts of customer data and better understand the reasons why customers leave. It found that running out of data was a key trigger. Customers found the process of buying more credit or switching to a different package was more hassle than simply changing to another provider.
The company took action fast by developing the capability to analyse real-time customer usage data. As a result, it can see when customers are running low on minutes or data. By correlating this information with data about each customer’s subscription package, Telefónica now creates personalised offers. Before they run out, customers receive a notification about their current data levels, along with a link to a tailored mobile data plan.
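The underlying logic can be sketched as a simple threshold rule on streaming usage data. The field names, 90% threshold and offer link below are illustrative assumptions; in production, a platform such as SAS Intelligent Decisioning would express rules like this within its own tooling:

```python
# Minimal sketch of a real-time "running low" trigger of the kind described
# above. Field names, the 90% threshold and the offer URL are all illustrative.
from dataclasses import dataclass
from typing import Optional

@dataclass
class UsageEvent:
    customer_id: str
    data_used_mb: float
    data_allowance_mb: float

def personalised_offer(event: UsageEvent, threshold: float = 0.9) -> Optional[str]:
    """Return a notification if the customer is close to exhausting their plan."""
    used = event.data_used_mb / event.data_allowance_mb
    if used >= threshold:
        return (f"You've used {used:.0%} of your data allowance. "
                "Tap to upgrade: https://example.com/tailored-plan")  # placeholder link
    return None

print(personalised_offer(UsageEvent("cust-42", 4_600, 5_000)))  # fires at 92% usage
```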
Since going live with this real-time deal personalisation engine, Telefónica has seen a 500 per cent increase in the uptake of new offers. Customer churn rates have also been dramatically reduced.
Being able to understand customer behaviour, create personalised offers and increase wallet share can have a massive impact on a company’s bottom line. Don’t just imagine how your organisation can be customer-centric; take the first steps today.
Companies increasingly want – and expect – their huge data resources to support key strategic change. Shifting from revenue-driven to profit-driven; moving the business upmarket; surely data can help? It can – but it isn’t, because companies are still too hung up on measuring their past performance – the ever-present Key Performance Indicators – rather than asking the big questions that deliver true business value, the Key Performance Questions. The result is that all the effort put into fishing in a sea of data, with a plethora of increasingly sophisticated analytics tools, including Artificial Intelligence, fails to deliver any real business insight. The ‘just store everything’ data model, promoted so heavily by IT functions, has not provided the promised machine-generated insights.
This is because the whole approach is back to front. Companies are currently not data driven; they are being held to ransom by the technical data handlers. If companies are to deliver value from the extensive (and expensive) investment in data, Peter Ruffley, Chairman of Zizo, insists it is time to stop measuring and start questioning.
Measurement Addiction
It is hard to find a business that hasn’t embraced a ‘data driven’ culture, often as part of digital transformation. Yet while the concept of leveraging data to improve business direction and performance is laudable, that is, sadly, not what these organisations are achieving. They are simply tracking performance. Adding Key Performance Indicators (KPI), extending the depth and reach of measurement metrics, even drilling down for more detail, remains an essentially backwards-looking approach. This culture of monitor and measure is not one that actively uses data to better understand performance and drive change.
What is required is a simple but essential change in mindset: a shift from the tracking of KPIs to the strategic relevance of Key Performance Questions (KPQs). The issue is not ‘did we meet our targets?’ but ‘how did we meet our targets?’. Not ‘how many shirts have been sold this week?’, but ‘are we selling shirts to all the potential shirt customers?’. Essentially: is this the right direction for the business?
Business Direction
Take a parcel delivery company, for example, wanting to use data to support its strategic shift from being revenue driven to being profit driven. The obvious KPI is profitability per parcel – but how does that help strategically? The KPQ is: who else can we sell our most profitable service to?
Or the holiday company that has decided to shift up-market towards more expensive and hopefully more profitable packages. Measuring every possible KPI to track performance has minimal value and certainly doesn’t identify the customers who haven’t been attracted to the up-market offer, where they holiday, and what compelled them to buy. The KPQs to be asked are not only ‘who are these non-customers?’, but ‘did marketing reach out to the right audience in the first place?’
The real problem for any company that has created a state-of-the-art, cloud-based data lake is that it contains only the data needed for KPIs; it will not have the right data to support KPQs, and there is no strategy for getting it. No wonder so many companies default to the track-and-measure paradigm. It’s all they can do.
Data Myth
This underlines one of the very real issues facing businesses today – the only apparent way to deliver any value from current data models is to add KPIs. Why? Because data scientists and technology vendors have propagated the myth that, given access to data and unlimited computing power, technology can do and solve anything. It can’t, not on its own. Data science needs direction. Data needs preparation. According to recent research from 451 Research, the biggest barrier to successful machine learning deployments is a lack of skilled resources, followed closely by challenges in accessing and preparing data. And the more data-driven the organisation, the worse the problem. Simply adding more data sources without direction is adding cost, not delivering value.
Taking the KPQ approach turns the entire model on its head and brings much needed direction. Rather than layering tools over a sea of data in a blind and typically futile attempt to realise true business value, KPQs focus the attention. A KPQ identifies the subject matter experts within the business who can give insight into those questions and prompts the essential discussions that reveal the data sources required. Suddenly, rather than looking at 25, 30, even 100 data sources, the KPQ may require analysis of just five or six.
Leverage Innovation
This opens the door to leveraging new technology, to experimenting, building prototypes and using AI to dig deeper into the answers. Indeed, there is no need for the data scientist: combine the right question, the right subject matter experts and the right, well-prepared data, and the speed with which the business can unlock insight can be truly transformative. Business experts will immediately see and understand trends; they will have the context and knowledge to identify insights that have business resonance.
In many cases that data will not be within the organisation; it will be third party or generic market data that will need to be blended with internal data resources to deliver insight. And this is where the compute power and the clever technology does have a role to play; where AI can be very quickly used to reveal whether there is any meaningful correlation within the data at all. Data landscaping provides unassailable information regarding the existence – or not – of mathematically identifiable connections between data items. If there are none, a business is either looking at the wrong data, or that data is incomplete. And this is an issue that companies will need to embrace: KPIs measure existing performance, based on internal data sources. KPQs may well demand additional external data and computing resources.
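A first pass at this kind of data landscaping can be as simple as a correlation matrix across the blended internal and external sources. A minimal sketch, with invented column names and data:

```python
# Minimal sketch of a first-pass "data landscaping" check: is there any
# mathematically identifiable connection between an internal series (sales)
# and external ones? All column names and values are invented for illustration.
import pandas as pd

df = pd.DataFrame({
    "weekly_sales": [120, 135, 128, 150, 160, 149, 172, 180],
    "gbp_usd_rate": [1.30, 1.29, 1.31, 1.27, 1.26, 1.28, 1.24, 1.23],
    "ad_spend":     [10, 12, 11, 15, 16, 14, 18, 20],
})

# Correlations near zero across the board would suggest the business is
# looking at the wrong data, or that the data is incomplete.
print(df.corr()["weekly_sales"].sort_values())
```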
Time for Change
Something has gone very wrong with the concept of data-driven business. Rather than providing insight to support essential business decisions, too many companies are simply sitting back and hoping that the vast quantities of data being collected will – almost magically – provide the elusive gold dust of fresh and valuable insights. It doesn’t work that way – however smart the AI or machine learning. Measuring the business in ever greater detail does not create a data-driven business. Where is the change? Where is the true strategic insight?
Data resources will not deliver value without direction, and senior management need to step up and ask the questions. What is the biggest business challenge? Can the data provide that insight? Are the potential gains worth the investment? Unless companies begin to proactively question the data rather than continue to monitor performance, nothing will change, and the concept of being truly data-driven will remain a myth.
This issue of Digitalisation World includes a major focus on augmented and virtual reality technologies, with particular reference as to how they already are, and will be, impacting the enterprise world. The feature includes a couple of articles on how AR and VR are likely to find a role within the business world, along with a range of expert comments as to how and where AR and VR might just be making a difference in the workplace, both now and into the future. Part 4.
As remote working takes off, it isn’t hard to imagine how replicating a working environment could be a huge gain for an increasingly-distributed business world, says Nigel Davies, founder of digital workplace Claromentis:
“A main entry point for VR in business has been simulation. We are already seeing VR simulators being deployed for training, with Walmart using it to test shop floor workers’ responses to various situations, such as in-store events with big crowds, clearing up spillages, and customer service issues.
“It’s also being used by companies that need to offer hands-on experience of working with high-risk equipment in an entirely safe environment, and by those that want to train HR staff in how to let employees go without eliciting anger or tears from those affected.
“There is still plenty of room for development. Soon we’ll be able to simulate almost any physical workplace scenario to turn those ‘out of the blue’ moments and anecdotes into things we are prepared for and have ‘experienced’.
“We'll also see more and more businesses using VR to create memorable experiences as part of their PR and marketing efforts. These VR-led events will need to become more creative once the novelty-factor of wearing headsets has worn off.
“I imagine the next step will be to deepen the experience for remote workers inhabiting the digital workplace. The workforce is undergoing a huge shift, as more workers push for the right to work from wherever they choose, rather than from the office. There are so many benefits to wellbeing and productivity, but there are a few downsides – some report feeling isolated and lonely.
“These feelings particularly affect people who thrive in social situations – extroverts – yet workplace technology can’t yet fulfil our deep-rooted desires for real and meaningful human connections with our colleagues.
“That could change when VR is able to drop us into a highly realistic metaverse with our co-workers, represented by avatars. As well as obvious benefits for collaboration, this has interesting ramifications for unconscious bias given that your avatar needn’t look like you, or even human.
“There is also huge potential for the way we do business internationally. With concerns over the climate crisis likely to deepen in the next decades, air travel will increasingly be seen as frivolous and damaging – hardly good for corporate responsibility. VR could put all parties in the boardroom without anyone having to move.
“The big question companies need to start asking is: how far do we go with VR? The possibilities are endless.”
Robert Brown, AVP for Cognizant’s Centre for the Future of Work, comments:
“Augmented reality (AR) is set to change the way we work more than ever by bringing new types of work to life through more AR analytics and better decision-making. For example, big-box retailers are already using thousands of AR headsets to dynamically optimise training for associates and managers alike. Within the enterprise, AR will have a huge impact on internal business processes by bringing together remote workers, saving time and money, tackling difficult tasks with “see-what-I-see” capabilities, and improving productivity, engagement, and competitiveness, all the while building on the trajectory of previous transformative digital technologies. For example, SMAC (social, mobile, analytics and cloud) technologies have made great headway over the last decade in digitising clunky, manual, paper-based and rote-and-repetitive work processes; however, there is still plenty of hands-on work that takes place today, particularly with legacy systems of record and engagement. AR will be particularly impactful for workers who need information while using their hands, such as those working in airplane maintenance or healthcare professionals conducting medical procedures like inserting an IV.
“We will likely see AR being used daily across a range of industries, but the jury is out on exactly when that will be. The widespread benefits of AR in enterprise are already being realised, though, with Cognizant research finding that one third of respondents have already scaled their AR initiatives into full implementations. In fact, the research predicts that companies are anticipating an 8.2% average top- and bottom-line growth by 2022 by using AR.
“It is becoming increasingly difficult to ignore the growing number of companies with AR implementations that are able to generate real results, as demonstrated in our research. Other factors will also drive its adoption – for example the growth of 5G, which will bring rapid data accessibility and speeds needed to process more connected AR experiences. The development of natural language processing will also be a driving force, as many people tend to find hand gesturing unnatural, whereas voice is an easier and more natural form of communication to understand for most people.
“At the moment though, one of the main barriers to mainstream AR adoption in the business world is a perception issue. Many still associate the technology with gaming, entertainment and clunky headsets, rather than a strategic business tool.
“For this reason, for now, we will continue to see an emphasis on B2C applications of AR such as American Apparel allowing its customers to try on its clothes with AR technology; however, it is becoming clear that interest is growing in focusing AR on internal work and business processes.”
Chris Attewell, CEO of Search Laboratory, offers the following observations:
The future of AR/VR in retail
Augmented Reality (AR) and Virtual Reality (VR) are on the rise and companies from all industries must capitalise on these new technologies early in order to stay ahead of the game as they continue to grow and develop.
As the technology becomes more easily accessible, we will see brands adopt AR and VR to give customers a more immersive experience. For retailers, there is huge scope to use this technology as part of their digital marketing strategy to increase sales, reduce returns and improve the customer experience.
Increase sales and reduce returns
Online shopping is on the rise, but so are online returns, as consumers receive products and decide they are not quite right. AR and VR will help online retailers reduce return rates by providing a digital ‘try before you buy’ solution.
We have already seen some brands adopt AR technology to do this – IKEA, for example, already has it in place with its IKEA Place app. The app allows consumers to see how furniture would look in their home, to scale, to better understand if it’s right for them.
We will also see the rise of at home VR shops, where customers can use VR headsets to ‘step into’ an experience which resembles a brick and mortar store, meaning they can see and interact with products before deciding to buy.
Improve customer experience
AR will also be used to enhance the customer experience in the high street. For example, AR mirrors are already in use by some brands, and allow customers to try on clothes or makeup before they buy without the need to wait in lengthy changing room queues or use samples.
Other options include access points customers can scan to find out more information about products, including reviews and pictures, or VR hotspots where they can experience your brand’s USPs, such as a tour of where a product is made.
As organisations become increasingly interconnected, global supply chains also grow far more complex and vulnerable. That means when chaos strikes – whether in the form of natural disasters, political upheaval, cyberattacks or market uncertainty – these events impact both suppliers and customers. Martin Clothier, Technical Director at Columbus UK, explains why the need for businesses to ‘expect the unexpected’ has never been greater in these volatile times, and how using predictive analytics will be key to effectively react to unforeseen challenges.
In the UK, Brexit has prompted businesses of all sizes to weigh up how any number of potential outcomes may impact their operations, and how they can effectively identify and manage threats to their business. In fact, recent research by Vuealta reveals that uncertainty over Brexit has resulted in more disruption to supply chains in the last five years than natural disasters and cyberattacks combined.
But is there a realistic opportunity here for businesses to not only react to events outside of their control, but use them to gain a competitive edge?
First – ask smarter questions of your data
It all comes down to the ability to use predictive analytics to manage supply chain risk. For instance, businesses can leverage machine learning techniques, such as regression analysis, which analyses historical demand to help predict product sales, allowing them to adjust production accordingly.
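As a minimal illustration of the regression technique described above (the demand figures are invented, and a real model would use far more observations and features than a bare month index):

```python
# Minimal sketch of regression on historical demand to forecast product sales.
# The data is invented; real models would add seasonality, promotions, etc.
import numpy as np
from sklearn.linear_model import LinearRegression

# Feature: month index; target: units sold in that month
months = np.arange(1, 13).reshape(-1, 1)
units_sold = np.array([310, 295, 330, 360, 400, 420, 455, 470, 440, 410, 480, 520])

model = LinearRegression().fit(months, units_sold)
forecast = model.predict([[13]])  # next month
print(f"Forecast for month 13: {forecast[0]:.0f} units")
```

Production can then be adjusted up or down against that forecast rather than last month’s gut feel.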
In 2019 and beyond, techniques in predictive analytics have advanced to such a point that they can combine traditional analytics with other external sources – including labour, weather, exchange rates or commodities markets – to allow you to ask much more intelligent and business-defining questions of your data. If your business relies on migrant labour, for example, you can now run scenarios based on labour costs and other data to determine the impact on the organisation should cross-border activity be negatively affected by Brexit.
Second – understand potential supply chain threats
Let’s look at another scenario, this time across the entire supply chain. We’re seeing an increase in extreme weather around the world, and meteorological data is proving a novel data source that can help inform business decisions alongside transactional data. Every enterprise resource planning (ERP) system has a vendor database that can be tagged with a geographical location, and this enables businesses to start to question the likelihood of weather events – such as a typhoon hitting Asia in the next six months – and how they would impact their supply chain.
You don’t need to be a data scientist to extract value from your data either. Any business decision-maker can now ask natural language questions such as: ‘What’s the average order quantity of this product in March?’ and get those answers back in a timely fashion. It is also possible to leverage an existing ERP system to create map-based visualisations, and tools such as Microsoft Power BI mean this ‘single source of truth’ is readily available to any stakeholder across the business. It’s even possible to build a supply chain dashboard that identifies any risk in the supply chain and classifies each risk according to a traffic light system depending on its severity.
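Before it ever reaches a dashboard, that traffic-light classification might be computed along these lines. The vendors, weightings and thresholds below are illustrative assumptions, not a real scoring standard:

```python
# Minimal sketch of a supply chain risk "traffic light": score each vendor
# and bucket the result into red/amber/green. Fields and thresholds are
# illustrative; a BI layer (e.g. Power BI) would sit on top of this output.
import pandas as pd

vendors = pd.DataFrame({
    "vendor":          ["Acme Ltd", "Globex", "Initech"],
    "late_deliveries": [0.02, 0.11, 0.27],  # share of orders delivered late
    "weather_risk":    [0.1, 0.5, 0.8],     # e.g. typhoon exposure, 0-1 scale
})

vendors["risk_score"] = (0.6 * vendors["late_deliveries"]
                         + 0.4 * vendors["weather_risk"])
vendors["status"] = pd.cut(vendors["risk_score"],
                           bins=[0, 0.15, 0.35, 1.0],
                           labels=["green", "amber", "red"])
print(vendors[["vendor", "risk_score", "status"]])
```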
Third – extract insights from a single data repository
Having a modern data repository that can sit in the cloud and act as a data hub between your different applications is the third significant pillar of effective predictive analytics.
As more organisations implement both their ERP and customer relationship management (CRM) solutions in the cloud, they require a single repository for both transactional data and aggregated analytics data, which can be used to extract actionable insights.
Take the Microsoft Power Platform as an example. At Columbus we now provide customers with a common data model that sits between cloud applications. The customer can use this central data hub not only for reports, but to make data available to employees in the field that previously wasn’t easily accessible. The customer is able, through the platform, to create their own PowerApps that expose this data.
Getting ahead of the game
Machine learning is like a digital crystal ball for businesses – it can help predict the future, while artificial intelligence (AI) gives business decision-makers the ability to ask smart questions in natural language and unlock data insights. Increased accessibility of data enables the worker on the shop floor, in the warehouse or out on the road to have the data visibility they require to make relevant, informed business decisions.
It’s impossible for businesses to see into the future but using predictive analytics is the next best thing. Advanced technologies such as machine learning and AI enable businesses to react to uncontrollable threats in the most effective ways possible – which could be the difference in ensuring future success and gaining a competitive advantage.
Remote Patient Monitoring (RPM) – the measurement and analysis of a patient’s health status without the patient needing to be physically present – enables an innovative, cost-effective, patient-friendly approach to healthcare. Nadia Tsao, Senior Technology Analyst at IDTechEx, explains how new technologies are revolutionizing RPM, making it a key enabler in driving down healthcare costs while achieving the highest standard of care for patients.
Traditional RPM is evolving at a rapid rate due to a number of external factors. We are seeing new market pressure coming from giants such as Apple, Google and Amazon looking to manoeuvre into the healthcare industry. Recently, Amazon announced its “Haven” project, a joint healthcare venture with JPMorgan Chase and Berkshire Hathaway to improve access to primary care, simplify insurance and make prescription drugs more affordable. With huge R&D budgets and a wealth of technology at their fingertips, their disruption potential is very high.
Changing population demographics are themselves sparking healthcare reforms. As populations age in both developed and developing nations, the prevalence of chronic diseases will continue to rise, and health systems are under pressure to change, prompting new ways to deliver healthcare. RPM is now being explored more seriously as a method of managing the rising health costs of chronic conditions.
New waves of RPM reform
With a finite number of clinicians and caregivers, it’s clear that smart technology will be used to ease the pressure on human resources. Wearables are an obvious method to monitor remote patients and deliver data to the relevant healthcare expert. At IDTechEx our analysts have tracked three waves of sensor and wearables development in RPM.
The first wave saw early sensors developed and used in healthcare including hearing aids and Holter monitors. The second wave brought sensors that were developed primarily in other industries but then made the jump into healthcare and wearable devices over time, such as the use of smartphones and smartwatches in health and fitness tracking. The emerging third wave is far more targeted, with sensors ‘made for wearables’ and developed with key properties in mind like flexibility, comfort and low power usage. These are less commercially mature, but examples such as mobile cardiac telemetry and continuous glucose monitoring will see the strongest growth and relevance in the long term.
The benefits of RPM are far from remote
RPM leverages a range of technologies and services to allow for the monitoring of patients both inside and outside of conventional clinical settings. RPM can track anything from medication adherence, temperature, movement and heart rate to blood pressure, glucose levels and oxygen levels. Applications of RPM are widespread, from monitoring and improving patient adherence, clinical trial monitoring and pre/post-op monitoring to more diverse applications such as monitoring diabetes, dementia, infertility and heart failure.
One of the primary benefits of RPM technology is that healthcare organisations are able to provide care to patients beyond regularly scheduled visits and typical opening hours. This reflects the industry-wide move toward decentralising care, easing the pressure on overstretched hospitals, clinics and pharmacies. Sensor-based wearable devices fit this style of care: they provide real-time patient medical data which, when processed effectively, allows resources to be efficiently allocated to provide individual care and improve patient outcomes.
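At its simplest, that processing is a stream of readings checked against agreed thresholds. A minimal sketch follows; the ranges and field names are illustrative assumptions only, not clinical guidance:

```python
# Minimal sketch of server-side RPM processing: flag out-of-range vital signs
# from a stream of wearable readings. Ranges are illustrative only and are
# NOT clinical guidance.
NORMAL_RANGES = {
    "heart_rate_bpm": (50, 110),
    "spo2_pct":       (94, 100),
    "temp_c":         (36.0, 38.0),
}

def check_reading(patient_id: str, vitals: dict) -> list:
    """Return an alert for any vital outside its normal range."""
    alerts = []
    for name, value in vitals.items():
        low, high = NORMAL_RANGES[name]
        if not low <= value <= high:
            alerts.append(f"ALERT {patient_id}: {name}={value} outside {low}-{high}")
    return alerts

# Example reading: elevated heart rate triggers one alert
print(check_reading("patient-007", {"heart_rate_bpm": 122, "spo2_pct": 96, "temp_c": 37.1}))
```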
Improving the quality of life for young and old
RPM promises to change the way healthcare is delivered, with patients themselves directly benefiting. Sensors and wearables linked to smart devices improve communication between providers and patients. Effective monitoring leads to improved patient quality of life, as patients require fewer visits to healthcare services. Real-time tracking of patient health means warning signs can be detected early and the patient can quickly receive preventative care.
One of the most prominent markets for RPM technology is in the care of chronic diseases, most prevalent in the elderly population. According to the World Health Organization, between 2015 and 2050, the proportion of the world's population over 60 will nearly double from 12% to 22%. This is partly why there is a disproportionate amount of emphasis on this population demographic when it comes to healthcare spending; long-term care is often required alongside strict medicine routines.
The future of RPM
There are immediate wins with in-home monitoring: chronic disease, pre/post-operative and post-discharge care. In the wellness market, RPM of babies and the elderly will continue to grow. However, looking forward, wearable sensor technologies need to become less invasive and more invisible, a trend which is already tracking positively with technologies such as electronic skin patches. More and more market players are starting to take note, which should drive this growth. Clinical trials, real-world evidence and medical adherence will be major areas of focus for the pharma sector, as organisations realize RPM has a significant role to play.
What drives the pace of change in RPM?
IDTechEx research shows there is a significant reason to believe that development in healthcare sensors and wearables will keep pace with these demographic changes. Our latest research shows that digital disruption of the healthcare sector will drive the medical wearables market to $19.7 billion by 2024. The report, Remote Patient Monitoring 2019-2029, provides examples of RPM technologies including electronic skin patches, smart mattresses, smart shirts, smartwatches, connected inhalers and digital pills. This report cuts across the expert analysis of wearable technology, wearable sensors, electronic textiles, electronic skin patches and digital health, to bring to healthcare providers the most relevant insights on the exciting innovations in RPM.
Find out more
The inaugural Healthcare Sensor Innovations 2019 conference will bring together healthcare experts, innovators, companies and end-users to focus on the latest emerging RPM technologies. Taking place 25-26th September 2019, in Cambridge, UK, the event provides a platform to share best practice advice on the use of wearables and sensors in continuous monitoring of individuals and point-of-care diagnostics. Register for the event now by visiting www.IDTechEx.com/cambridge
This issue of Digitalisation World includes a major focus on augmented and virtual reality technologies, with particular reference as to how they already are, and will be, impacting the enterprise world. The feature includes a couple of articles on how AR and VR are likely to find a role within the business world, along with a range of expert comments as to how and where AR and VR might just be making a difference in the workplace, both now and into the future. Part 5.
To say that technology has transformed every aspect of our personal and working lives is a bit like saying the sky is blue. Technology has pivoted to become more immersive and collaborative, enabling connections between people, wherever they are in the world, in myriad ways, says Jonathan Bridges, Chief Innovation Officer at Exponential-e.
Augmented reality (AR) and Virtual Reality (VR) are the latest technology transformations. We can see evidence of this all around us: Pokémon GO, Snapchat, and Apple have brought AR joy to countless consumers; Sony, Oculus, and HTC have endeavoured to make VR mainstream in the home.
AR and VR are creating fresh revenue sources while rendering certain legacy technologies obsolete. Although initial expenditure will be sizeable, there’s no doubt that businesses can make their money back through the productivity benefits gained.
Importantly, businesses which fail to adapt to these new technologies won’t survive. In the wake of Brexit, whichever form it may take, British businesses will have to re-negotiate their relationships across Europe and the rest of the world, so it’s crucial that they’re equipped with the right tools to achieve this.
Unfortunately, a global report from Capgemini found that UK companies were lagging behind France and Germany in their adoption of VR / AR technology — only 36 per cent admitted to using the tech, compared to 48 per cent in France.
As such, it’s never been more important for UK businesses to embrace the advantages that VR / AR technologies can bring. From a cloud perspective, they will be particularly pertinent in ushering in a new era in unified communications. From phone calls and conference lines to high-resolution virtual conferencing, companies are continually looking for ways to bring physically disparate people closer together. VR could successfully ‘place’ people into virtual meeting rooms.
Take architecture, for example: one of the biggest challenges architects face is demonstrating how a finished structure will look. AR and VR can overcome this and, as such, have become an integral part of the design process and client presentation. Architects can effortlessly showcase realistic project images to potential clients and stakeholders, enabling the latter to make changes and give feedback on designs (or approval) in no time.
The immersive and collaborative capabilities that AR/VR bring mean that seamless cloud-based collaboration between contractors, engineers and architects is both possible and highly desirable.
A final application of AR/VR in the workplace will be the desktop of the future. Previous decades have seen computing limited to 2D screens on laptops, tablets and smartphones; however, the rise of this technology facilitates a move beyond the screen into physical 3D spaces, enhancing and expanding the amount of information that can be displayed.
AR and VR are by no means introducing a digital utopia, though. Inevitably, business owners must have conversations about the prevalence of this technology and its potential impact on employees. An ‘always on’ culture needs to be avoided, and the privacy of staff using VR/AR protected – an important point for a healthy company culture. That balance is vital to strike.
Bradlee Allen, Product Evangelist at Fuze, comments:
Whether it’s interacting with customers and partners or sharing ideas and meeting with colleagues, businesses are built on communication. However, as workplace trends such as flexible working become more popular, it is critical that, if not physically present, employees can talk, share and interact as productively and effectively as if they were there in person.
While many businesses might foresee ‘trendy’ technology, such as VR, enabling virtual meetings via futuristic hologram, the reality is that these innovations are not ready for the enterprise quite yet. In the meantime, businesses must look for a simpler solution to lay the groundwork.
Virtual workforces will only be truly effective and possible in the future if organisations introduce common communication tools and applications that are available to every employee, regardless of the device they are using and where they are based. This will help future-proof offices; keep staff motivated, creative and productive; and give people a true work-life balance.
However, with technologies like AR and VR making noise and creating headlines, it can be hard to distinguish the hype from genuine innovation. In the face of constant transformation, business leaders need to avoid getting caught up in tech trends that will not make a difference to the workers. IT leaders may be ready to make radical changes, but these must be aligned with the demands of the workforce and not driven by technological developments alone. Those who prioritise the employee experience and productivity above future-gazing technologies will thrive in the new era of collaborative change.
The most important thing for businesses to remember is that workplace technology needs to enhance the employee experience and provide businesses with an opportunity to improve productivity and collaboration. This could mean introducing new VR tools, such as virtual reality meeting spaces, or it could simply mean providing better versions of the technologies that already exist. What matters most is that employees are being given a choice to use the technologies they feel comfortable with, and that allow them to work and collaborate in a way that suits them.
Matt Trubow, CEO at Hidden Creative, has built his work and relationships within the engineering, power generation, defence and maritime industries, giving him first-hand experience of applying these technologies, as well as clear insight into how his clients would like to harness this tech in the future:
The most obvious observation is that Industry 4.0 technologies, particularly virtual and mixed reality, are paving the way for teams to easily distribute knowhow, move faster, respond more quickly, be safer and achieve higher overall operational precision.
From a practical perspective, virtual reality creates safe, cost-effective and realistic training solutions and, together with mixed reality technologies, provides the platform required to create a blended 'learn while working' environment that facilitates the seamless transfer of knowledge from industry experts to the less experienced portion of the workforce and, where appropriate, on to the customer. This knowledge exchange can be distributed globally at the click of a button, 24/7. Virtual training, even for sophisticated systems, can be developed for the customer prior to product delivery, ensuring familiarisation and operational competency are reached before product commissioning.
Product development is another area that virtual and mixed reality technologies will enhance. These technologies enable the creation of an engineering-grade virtual representation of a new product or system. Within a virtual environment, engineers can perform maintenance and other operational tasks on existing design iterations, essentially perfecting the product refinement process. This can be done in a fraction of the time and cost associated with traditional product refinement methods. This is before considering the additional insight gained by product designers and the pre-production familiarisation the engineering teams obtain during the process.
The commercial service prospects for virtual and mixed reality technologies are vast. By implementing a mixed reality solution, one master technician can support a global workforce of low-level technicians remotely. Virtual assistance can be obtained instantly by customers, and information, animations and other visual instructional aids make remote assistance clear and simple. Support models will need to be expanded to accommodate Industry 4.0 technologies and the seamless bilateral transfer of knowledge, knowhow and customer insight.
Companies that embrace these technologies will thrive on the inherent efficiencies of an enabled workforce, as well as on the amazing things a creative mind can craft with this tech to solve real-world problems.
The technical learning curve is flattening, the digital landscape is changing more rapidly than ever, and the transformation that Industry 4.0 technologies are driving is empowering.
As the cloud services market has developed, many businesses have forged exclusive service provider partnerships to benefit from the scale, agility and performance capabilities now offered by numerous large global hyperscalers. While their enormous success is testament to the way they have met the needs of businesses worldwide, many organisations have additional requirements relating to application performance, legacy applications, data hosted under certain jurisdictions, and data security.
By Eltjo Hofstee, Managing Director, Leaseweb UK.
In a world of cloud choice, where different options support diverse use-cases, for many businesses, there simply is no “one cloud fits all” solution. As a result, IT leaders who want to spread their investment beyond a single cloud partner must allow for various solutions in their architecture planning. This is where hybrid cloud comes in.
The effect of this is that many modern businesses are procuring more than one type of cloud service - from infrastructure to applications. This kind of hybrid cloud strategy considers the need for integration between all types of cloud services and, for those who see it as an important point, helps avoid the potential issues associated with being locked in to a single cloud service provider.
Like any approach to cloud adoption, making sure that a hybrid strategy achieves what its users require needs careful forward planning to understand key factors. These range from identifying the best execution venues for different workloads to ensuring that contracts and agreements work well for everyone involved.
Different apps need different cloud infrastructure
Hybrid cloud computing offers a multitude of options for application hosting and development. To get the most out of a hybrid architecture, the workload must suit – and be optimised for – the infrastructure.
By classifying your organisation’s portfolio of applications and data processing requirements, you can break down the application landscape and architecture to plan the best path to migrating to the hybrid cloud.
Legacy applications can be difficult to upgrade and modernise, for example, but with a sound hybrid cloud strategy the business can progressively deploy new services without the risks of a huge migration project. The legacy applications can stay where they are, and there might even be options, if required, to connect them to one or more of your cloud providers.
In terms of Infrastructure-as-a-Service, every hybrid cloud journey should start by reviewing the balance between the applications' CPU performance, storage and scalability, and how frequently the application will require a 'peak load' operation. With these parameters defined, the workloads can be mapped to the most appropriate infrastructure. For example, the lowest performance requirements tend to be suited to a VPS, and the highest might be best suited to bare metal or dedicated servers.
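As a rough illustration of this mapping exercise, the hypothetical Python sketch below classifies workloads by these parameters. The thresholds and venue names are our assumptions for illustration only, not rules taken from any provider.

from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    cpu_cores: int        # sustained CPU requirement
    storage_gb: int       # persistent storage requirement
    peak_frequency: str   # "rare", "daily" or "constant"

def suggest_venue(w: Workload) -> str:
    """Map a workload to an execution venue using simple illustrative rules."""
    if w.peak_frequency == "constant" and w.cpu_cores >= 16:
        return "bare metal / dedicated server"   # highest sustained performance
    if w.peak_frequency == "daily":
        return "public cloud (autoscaling)"      # elastic capacity for peaks
    return "VPS"                                 # lowest performance requirements

for w in [Workload("reporting", 2, 50, "rare"),
          Workload("web shop", 8, 200, "daily"),
          Workload("database", 32, 2000, "constant")]:
    print(f"{w.name}: {suggest_venue(w)}")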
Focusing on business outcomes
IT and business leaders must also focus on business outcomes and quickly adopt the most optimal architecture. A number of processes can be put in place to ensure the hybrid cloud technology choices are benefiting the business, including changing the way people view IT infrastructure management.
It is important to think about the business benefits not only during a migration, but in the long term. Put something in place that ensures your business can take advantage of the flexibility of the hyperscalers, in combination with potentially more competitively priced cloud or dedicated servers elsewhere. Alternatively, use hyperscalers for your application servers while the database servers are hosted within a specific jurisdiction outside any public cloud; this too can be set up as a hybrid cloud. Once an application landscape has moved to hybrid, stay flexible and start optimising the architecture.
For example, a traditional business moving into e-commerce can use a hybrid cloud environment to be more flexible about how its information is processed and stored.
How to control hybrid cloud contracts and agreements
There’s little value in adopting a hybrid cloud strategy if it doesn’t deliver on cost and value for money. By following a few guidelines, it’s possible to make the contract process less disruptive to the business and avoid unnecessary expenses.
Creating a successful cloud contract must involve ensuring everyone is aligned. For example, an organisation can switch from one architecture to another only to discover there are layers that are not covered by the contract.
IT administrators think about migrations, system maintenance and monitoring, but will this be done by your new cloud provider? It is important to go over processes to ensure all services are covered as they were before. All parts of the service delivery process you used to perform need to be implemented in some way, regardless of where the service is hosted. Additionally, make sure the monitoring and system maintenance can be automated over the full hybrid cloud, or can at least be done in a similar way, so you don't have to manage and monitor every cloud you use differently.
For further cost benefits, it can pay to differentiate your peak load from your base load. Your base load might sit on longer-term contracts, while it might be more worthwhile to have your peak load on pay-as-you-go terms.
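To make the base/peak split concrete, here is a purely illustrative calculation in Python; the server counts and hourly rates are invented for the example and bear no relation to real tariffs.

HOURS_PER_MONTH = 730

base_servers = 4          # steady-state capacity, needed around the clock
peak_servers = 6          # extra capacity, needed roughly 10% of the time
peak_hours = 0.10 * HOURS_PER_MONTH

reserved_rate = 0.08      # assumed GBP/hour on a longer-term contract
payg_rate = 0.20          # assumed GBP/hour on pay-as-you-go terms

# Base load on longer-term contracts, peak load on pay-as-you-go:
hybrid = (base_servers * reserved_rate * HOURS_PER_MONTH
          + peak_servers * payg_rate * peak_hours)
# The alternative: size everything for peak on longer-term contracts:
all_reserved = (base_servers + peak_servers) * reserved_rate * HOURS_PER_MONTH

print(f"split contracts: £{hybrid:,.0f}/month; all reserved: £{all_reserved:,.0f}/month")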
Take a close look at the cloud contracts, understand what your responsibilities are and make a note of what the changes might be. Often managers assume the techies will "make it work" when moving to the cloud. If the business requirements are not well communicated or understood by both sides, a poor selection of cloud infrastructure for the application can result.
Another good practice is to review the scope of the new platform and check back to see if the initial migration goal aligns with long-term strategies. For example, a business requirement for big data processing might involve high-capacity cloud services, which are being paid for, but not utilised, most of the time.
Ultimately, each business needs to reach a decision on what version of cloud works best for them. Fundamental to the whole cloud concept are choice and versatility. For some, a strategic partnership with a single provider is ideal, while for others, the blend offered by hybrid cloud provides the flexibility that comes from working effectively with a portfolio of partners.
In an era of digital transformation, the modern workplace is taking on a new definition, and the meaning of trust is changing with it. Security management now extends beyond on-premises infrastructure and individual devices, as modern working practices enable employees to connect to workplace tools and resources from any location, at any time, and from any device.
By Jesper Frederiksen, UK GM, Okta.
A study by CEBR, commissioned by Lenovo, found that 57% of the UK's labour force (approx. 15.2 million people) work in roles that are sufficiently non-physical to be eligible for remote working. The sheer number of people who could potentially be working from home, and who often want to work outside the 9-5 tradition, is enormous. That means sensitive company information is no longer bound to the office.
This new flexibility around where and when a person works brings in fresh challenges controlling and managing access to corporate resources. Organisations that enable bring-your-own-device (BYOD) and flexible working policies have no option but to rethink their network perimeter and security policies. In short: the days of building a moat around your castle are long gone.
Organisations can no longer assume that trusted users will access the corporate network onsite and within the protections of the firewall. When work is increasingly done outside the safety of a corporate network, managing and enforcing trust based on the physical perimeter is not only extremely difficult - it’s insecure.
In this new world, the only commonality is the user accessing some resource, so security architects now build new trust networks based on identity and authentication to mitigate risks.
The concept of Zero Trust has emerged as a reaction to the modern digital landscape. A core component of Zero Trust assumes all people and devices accessing or processing the data are untrusted by default. All access to corporate resources is restricted until the user has successfully proven they are who they say they are, which can be based on a number of factors – from a password, to a hard token like a YubiKey – as well as context on their typical sign-on habits. Contextual and continuous authentication are also recommended to monitor any changes to that information.
To build a successful Zero Trust strategy founded on identity, there are three important stages to consider:
1. Unify identity
The first and most essential stage is to consolidate fragmented user identities under one Identity and Access Management (IAM) platform across the cloud and all devices, in order to manage both authorised and unauthorised access. This entails single sign-on (SSO) for all users, from customers to the full extended enterprise of employees, contractors and partners. On top of this, using a second factor of authentication with SSO adds another layer of protection to mitigate attacks targeting credentials.
2. Grant access contextually
The second step in the implementation of Zero Trust is the application of context-based access policies. This involves identifying the user’s context, application context, device context, location and network, and applying policies based on that information.
In a cloud and mobile world, where people access resources and data from multiple devices at any given time and location, this step is critical in managing access based on contextual insights. For example, if a known user attempts to authenticate from their usual, trusted work laptop, but they are in a foreign country on a public Wi-Fi network, the Zero Trust policy could automatically increase the level of authentication required – such as requiring both a password and a second factor.
A contextual access policy is useful when security teams must account for the risks associated with lost and stolen devices, as one example. By enabling authentication based on varied signal inputs, organisations can mitigate the possibility of lost phones giving away company data when landing in the hands of outsiders.
3. Authenticate continuously
In the final stage of Zero Trust implementation, identity is continuously measured with adaptive, risk-based assessments to identify potential threats throughout the user’s experience. This involves the application of intelligent, risk-based mechanisms to create a risk-score and tolerance measure based on the contextual information received. This adaptive and continuous identity assessment means trust is no longer absolute – it is assessed against all variables at all times.
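As a thought experiment, the Python sketch below shows how the contextual signals of step two and the continuous risk scoring of step three might fit together. The signal names, weights and thresholds are invented for illustration and are not drawn from any particular product.

def risk_score(context: dict) -> int:
    """Combine contextual signals into a simple additive risk score."""
    score = 0
    if not context.get("managed_device"):   score += 30
    if context.get("new_location"):         score += 25
    if context.get("public_network"):       score += 20
    if context.get("unusual_sign_on_time"): score += 15
    return score

def required_authentication(score: int) -> str:
    """Map the risk score to an authentication requirement (the policy decision)."""
    if score >= 60:
        return "deny"
    if score >= 25:
        return "password + second factor"   # step-up authentication
    return "password"

# A known user on a trusted work laptop, but abroad on public Wi-Fi:
ctx = {"managed_device": True, "new_location": True, "public_network": True}
score = risk_score(ctx)
print(score, "->", required_authentication(score))   # 45 -> password + second factor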
Implementing a Zero Trust strategy that establishes identity as the new perimeter will not only secure corporate resources by ensuring only verified users are granted access, it will also help companies maintain the mobility and flexibility that today’s workers expect. With the right authentication protocols in place, users will be able to use any device and work from anywhere they choose.
This issue of Digitalisation World includes a major focus on augmented and virtual reality technologies, with particular reference to how they are already impacting, and will continue to impact, the enterprise world. The feature includes a couple of articles on how AR and VR are likely to find a role within the business world, along with a range of expert comments on how and where AR and VR might just be making a difference in the workplace, both now and into the future. Part 6.
AR and VR applications sometimes focus on entertainment and sometimes offer a completely practical 'tool' solution; it all depends on the business needs of the brief, according to Christophe Castagnera, Head of Connected Experiences at creative agency Imagination:
There is an interesting middle ground, such as training, where the application needs to offer 'edutainment' whilst also providing hard facts for users to learn. Here are four interesting use cases that demonstrate the breadth and depth of this vibrant sector.
Feeling the future (B2B)
An entire interactive and immersive world was created by HSBC on VR headsets for investors at a key event in Davos. The experience showcased the ambitious Belt and Road initiative and how it will connect and change our relationship with the vast expanse of Central Asia. Using VR meant that users 'felt' the Belt and Road project around them, so they were far more emotionally engaged in being a part of the investment programme.
Immersive edutainment (B2B and B2C)
Insurance Australia Group created 'The FIRST PLACE', an interactive VR experience that challenged participants' knowledge of home safety in a unique and immersive way. Once in the virtual reality living room, players have 90 seconds to identify 12 everyday hazards. After the experience, they also have the option to learn more about the dangers of each hazard presented.
5G will take Immersive Technology to the next level (B2B)
Vodafone are heavily involved in the world of 5G and have various programmes to communicate the benefits of their 5G network. One of the great things about 5G is that it reduces the delay, or latency, of information travelling to devices such as VR headsets or haptic suits, even over large distances. Vodafone brought this to life through a live media demonstration in which a rugby player 100 miles away tackled a padded cylinder; the impact was transferred instantly, and with dramatic effect, to another rugby player at the live media event itself, who was wearing a haptic suit.
Jaguar electrifies (B2B and B2C)
Jaguar revealed its all-electric I-PACE concept car to the world using a mix of VR and live theatre. A unique experience for media in LA involved 66 networked HTC headsets to explore the car through a VR story. Guests in LA were then joined in VR by 12 guests in London for a special media Q&A with the designer, Ian Callum. This content was later used for a retailer immersive experience, allowing customers to see the car in VR before it hit the showrooms for real.
With over 23 years of experience working in this space with our clients, Imagination knows the value of these solutions. We created an 'Immersive Spectrum' five years ago to map these business goals, whether they are AR, VR or a blend. This tool allows us to work with our clients to understand their business strategy and define the right solution.
Oluwaseyi Sosanya, co-founder of Gravity Sketch and a Member of the Royal Academy of Engineering Enterprise Hub, comments:
Gravity Sketch is novel software that lets people effortlessly sketch and develop 3D models for industrial design in a fast and intuitive way through the use of VR and touch technologies.
“AR and VR technology is increasingly being employed by businesses to drive efficiencies and overcome designers’ frustrations in effectively translating their ideas into computer models. At Gravity Sketch, we employ VR to create an immersive 3D design tool that enables designers to quickly ideate, visualise and communicate creative concepts in real-time.
We have found that bridging the gap between the second and third dimensions has helped troubleshoot a number of problems associated with the design process, which has ultimately driven efficiency, shaving weeks to months off design time.
Interestingly, we have also found that removing some of the frustrations associated with the design process has improved employee satisfaction, allowing people to work more collaboratively and gain a greater sense of authority over their ideas.
For too long, our industry has focused on the entertainment and lifestyle applications of AR and VR but, much like the phone and the computer, we can't expect to unlock their wider societal potential until they are fully integrated into enterprise.
Currently, the technology is being used by design teams and engineers, but for AR and VR to really succeed in enterprise it needs to become an everyday business tool that is embedded into the normal workflow.
We expect to see greater demand for the technology as improvements are made to 5G, which will improve its accessibility on portable devices, and as the overall cost of the technology decreases.
Eventually, we will reach a stage where any business that is talking about the third dimension - whether that is a physical product, flight path or relief map – will be providing employees with a headset as a third desktop display.”
Fabio Torlini, Managing Director Europe at WP Engine, says:
Like any new technology, the first few years of VR have been spent trying to force-fit it into the mould of how people have traditionally consumed media. The potential of VR lies less in how it can mimic what people are already doing today, and more in its narrative potential and its ability to achieve "presence" in a digital form.
Sports, for example, require fans to spend a lot of money if they want the best seats in the house. With VR, fans can have front row seats for every game or concert, at a fraction of the cost. VR can also offer experiences that are impossible in the physical world by giving users the ability to shift their perspective, in real-time, essentially enabling them to be their own producer or even to replay certain events.
We’re also seeing early applications of AR/VR in the retail world with brands providing a way for customers to test or see how a product works or trial how a piece of furniture might look in their home. Short of actually buying the sofa and putting it in your home, this would have been impossible in the past. Certain car manufacturers are also seeing success this way, allowing customers to sit in the driver's seat without ever stepping into a dealership.
One of the great things about technology is its potential to make things more accessible - more experiences, more information, better insights, etc. Open-source technologies like WordPress are levelling the playing field by making it simple for small teams, startups, and individuals to integrate VR into their digital experiences. As mobile processors improve and the barrier to entry for new users continues to fall, the need for thousand-pound headsets becomes irrelevant, and any brand, sports team or person can share these new VR experiences with anyone.
The methods needed by CIOs to gain full visibility of their organisation’s estate.
By Paul Hampton, Senior Director of Product Marketing, Alfresco.
Today, organisations typically use around 5-10 content management platforms; in some extreme cases, this number can be well over 50. That’s over 50 places in one company where a document could be created, saved, edited, updated and shared – this is content management chaos. Content management nirvana, on the other hand, is a single company-wide content management system where everything is stored.
Most companies sit somewhere between nirvana and chaos. It’s easy for content to get shared across multiple platforms, such as Google Drive, DropBox, Skype, email or Microsoft SharePoint. The number of content tools can rise as a result of acquisitions, or simply because departments or employees go rogue and use their own tools to satisfy their “unique needs”. This can have dangerous consequences, especially when sensitive corporate information is stored in non-secure locations.
Dangerous repercussions
The problem with using multiple content management platforms is that there’s no single source of truth. If an end-user needs to find a file, where is the most recent version of the document stored? Perhaps it was shared with a partner on DropBox, so people end up working off the non-master version of the document, meaning that out-of-date documents such as terms and conditions, price lists and contracts could still be in use.
Mistakes can then arise from multiple people working on different versions of a document, which could mean that important updates are missed. How many times have you seen “version 7” or “final version” added to a file name? But can you really trust this? When there is a need to cross reference information in multiple areas then the organisation becomes inefficient – time is wasted searching, decisions are made using outdated information and there’s extra work in consolidating various versions of content.
There are also regulatory and legal restrictions that require companies to centralise content. Litigation can become a nightmare. If an organisation finds itself being taken to court, documents will become vital, and it will be difficult to locate them if they are spread across umpteen different systems. All systems would have to be audited to find everything about a specific case, whether it be an IP infringement, an HR disciplinary incident or some such matter.
This can be a time consuming and cumbersome process, meaning it is often outsourced, which in turn costs a company even more money. This is just one example of the security, risk and compliance issues surrounding content management; let’s not even dive into the GDPR rabbit hole.
Banishing chaos
So, how do organisations move towards content management nirvana? First, there needs to be a regular audit: check existing business systems and understand why they are being used. Is there a real need for them, or have they been adopted by an individual or department to get around IT, or to use a feature they thought wasn’t available? Look for ROT – redundant, outdated or trivial content – things you can simply get rid of, that cost too much to manage and store, and that you don’t need to maintain. Finally, check network traffic and see what’s being accessed so that any applications flying under the radar can be identified. Remember, it is not always the most obvious places like email: how many people have put documents on a USB stick and lost them? Think about all the ways information is leaked, lost or placed elsewhere.
Data hoarding is also a big problem. Introducing retention schedules as part of an information governance capability makes it possible to identify what needs to be kept, why and for how long. Information lifecycle procedures can then move data from one storage area to another. Data may need to be retained for compliance reasons past the time when it is actively being used; rather than keeping it on expensive hard drives, it can be migrated to progressively cheaper ‘cold’ storage, where it can still be accessed if necessary.
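As a simple illustration of a retention schedule in action, the Python sketch below classifies documents into lifecycle actions. The document types and retention periods are assumptions made up for the example, not compliance advice.

from datetime import date, timedelta

RETENTION_YEARS = {"contract": 7, "invoice": 6, "draft": 1}   # assumed schedule

def lifecycle_action(doc_type: str, last_accessed: date) -> str:
    """Decide what to do with a document based on type and last access."""
    age_years = (date.today() - last_accessed).days / 365
    keep_for = RETENTION_YEARS.get(doc_type, 2)   # assumed default of 2 years
    if age_years > keep_for:
        return "delete"                  # past its retention period (ROT candidate)
    if age_years > 1:
        return "move to cold storage"    # retained for compliance, rarely accessed
    return "keep on primary storage"

# A contract last touched three years ago: retained, but moved to cheap storage.
print(lifecycle_action("contract", date.today() - timedelta(days=3 * 365)))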
Moving forward
The messy patchwork of legacy systems exacts a high price. It’s costly to maintain and a drag on operational efficiency, user productivity and IT resources. It’s also a huge barrier to innovation. IT organisations wrestling with silos of information stored in closed systems based on decades-old technology can’t easily deliver great customer experiences, maximise efficiency or benefit from the agility and cost savings of the cloud, and can’t respond fast enough to new business requirements. Lagging behind in digitalisation is an emerging fear for businesses: in a recent survey of 300 digital transformation decision makers, the majority (87%) said their business results would be impacted by a technically innovative competitor.
When business and IT join together in harmony, the benefits reaped are massive. The aforementioned survey yielded some interesting results on how technology is valued: expected business benefits from digital transformation include improved employee productivity (74%), decreased costs (71%) and increased customer satisfaction (62%). Decision makers forecast that there are long-term advantages in eliminating the technological cacophony caused by disparate outdated systems. In ushering in a unified replacement system, a business may encounter some teething issues, but the consequent economic and productivity boost generated by migration is an example of why we should embrace the digital age and its modern solutions – especially when managing content. Don’t let the sprawl take over.
With digital technology dominating the modern workforce, simplified and effortless IT platforms are in high demand.
By Dominik Birgelen, CEO at oneclick.
For any new tech software, accessibility for the end user is one of the highest priorities, since an organisation is committed to providing the full ecosystem of applications employees need to do their work. Therefore, certain elements must be considered when choosing these platforms, and some are more vital than others. Below are seven of the most crucial aspects of creating a better end-user computing experience.
Virtual Desktop Infrastructure (VDI)
VDI transfers desktop management from a local environment to a virtualised one. This means deployment, management and maintenance of replacement endpoints becomes much easier for overworked IT departments. Routine upgrades, installations and more can be completed without any user interaction. This enhances productivity for end users, who can work undisturbed, using the latest innovative software without the installation hassle. In addition, the user isn't tied to just one hardware device with VDI, as the desktop can easily and quickly be transferred from one device to another.
Desktop as a Service (DaaS)
DaaS shares the same elements as VDI but with one vital difference - instead of working as an in-house application, DaaS is based in the cloud, providing employees with a cloud computing work platform hosted by the company’s cloud provider. It is a cost-effective and secure way of working, offering all the same features as VDI but with increased flexibility and productivity. In addition, there are no upfront costs, and users only pay for what they use on a monthly basis. The platform is accessible from any location with an internet connection, meaning updates and installations can be carried out while the end user is out of the office, making the process that much more efficient.
HTML5
HTML5 is the latest standard for defining web pages. It is cross-platform, which means it works whether you're using a tablet or a smartphone, a netbook, notebook, ultrabook or a smart TV. This quality enables consistency between all platforms and is cost-effective for end users. Along with this, HTML5 delivers an overall better digital experience by widening the range of design and presentation features across various media types, giving developers the tools to produce better websites and applications. This is essential from a business point of view, as creating an accessible and usable site or system means that users will be more likely to engage.
Bring your own device (BYOD)
In the past, employees bringing their personal devices to the workplace was problematic due to security risks. Whilst businesses did their best to dissuade staff from using their own devices, there were multiple examples of an organisation’s data being compromised as a result of BYOD. However, if users are given access to a secure cloud platform, BYOD is now worth considering. A password-protected platform enhances flexibility, letting individuals work in their preferred space wherever there is access to the internet. This means the employee has a comfortable experience using a device they are familiar with, whilst the employer enjoys the financial benefit of not having to purchase individual devices, keeping both parties happy.
Collaboration
Working in a digital, collaborative environment gives end users a fast-responding and cooperative way of working. There are many different channels that enable end users to work in this way, including email, Skype, team workspaces and document-sharing solutions. These platforms allow end users to work openly, which can save time and increase the accuracy of work. Rather than employees having individual workspaces, colleagues with the appropriate access can view relevant documents and emails. Furthermore, it offers a platform for virtual meetings to take place, saving time for all those involved.
Security within the cloud
As business technology evolves, so do the hazards associated with the increasingly digital workplace. Businesses are under constant threat of being hacked or having their information exposed. To address this, cybersecurity can be moved to the cloud, which has the ability to hold and protect large amounts of data accessible only to selected end users. Cloud security services - along with all the data being processed - are dynamic, scalable and portable. If a user's hard drive is stolen, the multi-device login means no data is lost with the device. Additionally, if the cloud detects unusual activity, it will respond to the event, avoiding any disruption for end users.
Productivity and working remotely
Using the cloud allows end users to work remotely, away from the traditional office. This increases employees' flexibility and satisfaction, which leads to more efficient working. By personalising their own preferred workspace, users will encounter fewer of the distractions that occur in the workplace - distractions estimated to cost the US economy $600 billion a year. With the cloud, users have the option to work from home, which resolves multiple issues at work. For example, if employees are sick or unable to come in, they are still able to complete work. This benefits both sides: the employee is able to keep up to date with their work in their absence, and the employer can still monitor productivity from a different location.
The way businesses work has transformed significantly in recent years and, for many IT departments, keeping up with these transformations can be challenging. However, by bringing these digital elements into one platform, end users will no longer fear the transition to a digital platform and will be able to work in an open but secure virtual environment. Organisation, efficiency and cooperation are just a small portion of the benefits end users will experience when making this transition; businesses just need to choose the right software platforms to enable it.
One of the main drivers in the digital business transformation of companies is the data strategy. Whereas in the past this was influenced by rapid developments in the areas of data-driven business and data economy, today the focus is clearly on customer benefit. Oliver Schiffers from Publicis Sapient explains what this means for future data strategies and which elementary components they should contain.
Business is changing rapidly. Until a few years ago, marketers in particular used data almost exclusively to realize optimization and efficiency potential in the media environment with a tactical reach, but this was never about real growth. Decision makers are finally beginning to recognize what really counts: the relationship with the customer. It is a company's most important asset. In light of this paradigm shift, topics such as data monetization and targeting take a back seat to strategic considerations. Likewise, the mistaken belief that, in the search for the "Next Best Action", algorithms and machine learning alone can create an advantage or even a unique selling proposition over competitors becomes secondary.
The new target in the use and analysis of data is therefore customer centricity. A complete and comprehensive data strategy must offer both exponential growth and optimization potential - in the sense of a digital business transformation. It should consider methodologies and customer and business benefits, rather than being dominated by technology and tools alone. Decision makers who orient themselves around the following five steps will be well placed for sustainable success:
1. Create liquid data structures
The aim is to make the data already available in the company 'liquid': in other words, by making specific information available in the data, customer needs can be addressed, the user experience personalised and new digital products and services developed. To generate real business benefits in this way, company- and customer-specific metadata is always necessary. This metadata describes the behaviour and requirements of customers, or of the company's own products, as exactly as possible, and goes beyond the typical breadth of data collected by competitors or standard solutions. Data scientists rely on this metadata for meaningful and unique results, yet it is often neglected during data collection. As a result, data for the orchestration of digital customer contacts across all tools and channels is missing. A one-off or selective enrichment is not sufficient here; rather, it is necessary to systematically and structurally check and optimize the data collection.
2. Understanding data-driven as both culture and process
A data-driven corporate culture and the associated business benefits require fundamental changes in all areas of the company. These need to be addressed step-by-step and affect all processes, procedures and decision making along the digital customer journey. Only if decision-makers can experiment, formulate the right questions, draw up hypotheses and neatly integrate and demonstrate the lessons learned in each case can the change to a data-driven culture succeed. Even at this stage, the mindset of all those involved in the process must turn in the direction of customer benefit.
3. Become the master of your own data
The basis for a successful data strategy is a rethink in the management of tool providers, agencies and service providers: data and metadata must be completely under corporate sovereignty. This is because the use of standard solutions, or of agencies that collect data for companies, often leads to a worrying reduction in data quality due to privacy concerns and a decline in user confidence. In addition, the breadth and depth of the underlying data is often determined by the standard functionalities of the solutions used, which prevents companies from carrying out additional activities to improve their own data quality. If, on the other hand, marketing and sales departments collect and analyze data and metadata themselves in the future - in the spirit of data ownership - business benefits will arise that are neither defined nor limited by the best practices and use cases of the large providers. The result is a completely original digital business model based on the company's own data and deep user relationships.
4. Give and take: Trust and customer value exchange
For the greatest possible data depth and breadth, it is not enough merely to comply with privacy regulations. Companies must develop individual privacy concepts, provide and communicate clear evidence of customer benefits online and, in individual cases, perhaps even remunerate customers for their data - while communicating transparently how and for what purpose that data is used.
Long-term success can only be guaranteed by a clean exchange of collected customer data for actual solutions to customers' problems and an improved customer experience. At the same time, this is fundamental for decision-makers in the company if they want to focus on customer (lifetime) value and develop their own comprehensive digital business models.
5. Building digital ecosystems
A sustainable digital business transformation always requires methods to keep the own business model competitive in the long run and to protect it against imitators and competitors. Often, however, corresponding approaches to digital business strategies are based solely on economies of scale or the development of a platform economy of one's own - a mostly unrealistic and always expensive undertaking. However, if companies develop digital products and services, a first step towards protecting the business model can be achieved. The next step should be to set up a digital ecosystem in which the products and services are integrated. The ecosystem should include equal partners and third-party providers who cooperate with each other and exchange data and algorithms for their services. This will enable all participants to develop new digital products that build on each other. Complementary algorithms, the transfer of individual knowledge and common augmented models of analyses and decisions lead to the highest possible customer benefit. This added value ensures the uniqueness of the new digital business model based on the ecosystem.
An innovative data strategy is no longer efficiency-driven, but aims to generate unique selling propositions and new, sustainable digital business models. The path to this goal requires individual solutions as well as a rethink that prominently incorporates customer benefit into the overall approach.
Neural networks have become the alchemy of our age, the search for a magical, mystical process that allows you to turn a pile of data into gold. They are widely seen as a silver bullet that can generate new insights and expert decisions at unprecedented speed and scale.
By Ben Taylor, CTO at AI-powered, automated decision making platform Rainbird.
Yet this ignores the reality that ‘deep learning’ systems are difficult to create or audit, and most organisations lack the necessary in-house expertise or ‘data hygiene’ to use them effectively. Despite a recent Oxford University survey finding that AI will have a 50% chance of outperforming humans in all tasks within just 45 years, there are numerous complex tasks where human intervention is imperative in order to provide the level of transparency needed.
Machine-learning systems derive insights from such complex probabilities and correlations that only a trained data scientist can begin to understand them. This means machine-learning can be a closed book to the very experts in the business that most depend on it and can therefore have the effect of disempowering other employees. ML systems are also prone to producing irrational and inexplicable decisions because it is difficult to work out whether the algorithm derived its decisions from some unseen variable in the data, such as an unnoticed feature of an image.
Neural networks also cannot think outside the context of their ‘learning environment’ and thus a neural network is only as good as the data it was trained on. This means they are prone to inheriting biases from data; if an AI is trained to autonomously filter job candidates by analysing a sample of previous recruits, it might reject female candidates if 80% of the training sample happened to be male. Because neural networks do not follow human rules of logic, they can be prone to spurious correlations. For example, an insurance AI analysing poorly-structured driver data could decide to increase premiums for Renault owners just because Renaults happened to be over-represented among a sample of dangerous drivers.
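To see how easily this can happen, here is a deliberately toy Python sketch on synthetic data, loosely mirroring the recruitment example above; every number in it is invented for illustration.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
male = rng.random(n) < 0.8                  # 80% of past applicants were male
skill = rng.normal(0, 1, n)                 # the attribute that *should* matter
# Historical hiring favoured men independently of skill:
hired = (skill + 1.5 * male + rng.normal(0, 0.5, n)) > 1.0

X = np.column_stack([skill, male])          # gender ends up encoded as a feature
model = LogisticRegression().fit(X, hired)

# Two candidates with identical skill, differing only in gender:
same_skill = np.array([[0.5, 1.0], [0.5, 0.0]])
print(model.predict_proba(same_skill)[:, 1])   # the model scores the man higher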
It is also a myth that neural networks can work ‘off the shelf’ without human intervention. Effective deployment of deep learning systems requires expert data scientists to assist with everything from procurement to configuration, auditing and data hygiene. This makes neural networks increasingly expensive to implement, because the requisite data science talent is a scarce and increasingly in-demand resource. Neural networks can supplant and exclude human workers because they make complex judgments based on data science beyond the comprehension of laymen; if an AI uses some ‘black box’ technique for detecting fraud, it leaves the human fraud squad unable to replicate its success.
The fact that neural networks can only be configured and trained by data scientists means that nobody else in the organisation understands how they work. This hampers the ability of relevant subject matter experts to audit AI decisions internally and undermines an organisation’s ability to justify those decisions to regulators and customers. By contrast, reproducing human thinking with machines - capturing and codifying human expertise and experience as explicit rules - enables that expertise to be spread and scaled at speed across a company or a country. It not only allows machines to reproduce human judgements but also enables the secrets behind expert decisions to be explained and taught to other human employees, helping to up-skill the existing workforce. This makes limited human resources stretch further and enables organisations to respond rapidly to increased demand and manpower shortages.
Since AIs will increasingly be helping human professionals from lawyers to fraud prevention teams, it makes more sense for their human ‘colleagues’ to be involved in customising and auditing them. The only answer is a return to ‘rules-based’ AI systems that reflect human thinking and can therefore be configured and audited by relevant subject matter experts.
Encoding human expertise for auditable machines empowers employees to turn their expertise into a ‘blueprint’ for best practice across a business, one which can improve machines and humans alike and bring consistency to the whole organisation’s performance. It also frees human experts to concentrate on more strategic tasks. Rules-based algorithms augment rather than replace human talent, making humans smarter rather than taking their jobs away.
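To make the idea of an auditable, rules-based system concrete, here is a minimal Python sketch. The fraud rules, weights and threshold are invented for illustration; in practice they would be written and tuned by the relevant subject matter experts.

RULES = [
    ("amount over £10,000",           lambda t: t["amount"] > 10_000,     40),
    ("destination is a new payee",    lambda t: t["new_payee"],           30),
    ("outside normal business hours", lambda t: not 9 <= t["hour"] < 17,  20),
]

def assess(transaction: dict):
    """Score a transaction and keep a human-readable audit trail."""
    fired = [(name, weight) for name, test, weight in RULES if test(transaction)]
    score = sum(weight for _, weight in fired)
    decision = "refer to fraud team" if score >= 50 else "approve"
    return decision, fired   # 'fired' records exactly why the decision was made

decision, trail = assess({"amount": 12_000, "new_payee": True, "hour": 3})
print(decision)                      # refer to fraud team
for reason, weight in trail:
    print(f" - {reason} (+{weight})")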
By Steve Hone, CEO and Cofounder, The DCA
The DCA explain why they support R&D and provide details of DCA Partner EcoCooling’s involvement in a project to build and manage the most efficient data centre in the world.
Investing time in research can deliver real benefits to business. Many successful companies, such as those producing consumer goods or mass-market items, invest heavily in research and development (R&D). Industries including computer software, semiconductors, information and communication technology, robotics and energy all have high R&D expenditure – R&D is critical to product innovation and the improvement of services, and can also help to secure major advantages over competitors. Taking the time to examine product offerings and identify improvements can differentiate one organisation from another. In fact, R&D can contribute to raising a company’s market value and to boosting profitability.
From inception, the DCA Trade Association has actively championed R&D initiatives and projects. This has enabled us to maintain strong ties with the academic world and to build enduring connections with major players in the data centre sector – organisations that remain committed to maintaining the health and sustainability of the sector.
Over the last eight years, through continued collaboration with academic, strategic and corporate partners, the DCA has successfully helped to secure R&D funding for EU Commission projects such as PEDCA, the award-winning EURECA project and the DEW COOL project, a collaboration between Europe and China.
The DCA has also supported dozens more EU Commission R&D projects, including OPERA, SLALOM and the ICT Footprint project, to name but a few.
Contact the DCA to find out more about research projects: info@dca-global.org
The DCA frequently facilitates networking and introductions that allow our partners to meet and discuss collaboration and research ideas and projects. For two years, DCA corporate partner EcoCooling has been involved in an EU Horizon 2020 project with RI.SE and Boden Business Agency to build the most efficient data centre in the world.
EcoCooling’s report below provides interesting reading.
Since 2017, DCA member EcoCooling has been involved in an EU Horizon 2020 funded ground-breaking pan-European research project to build and manage the most efficient data centre in the world! With partners H1 Systems (project management), Fraunhofer IOSB (compute load simulation), RISE (Swedish Institute of Computer Science) and Boden Business Agency (Regional Development Agency) a 500kW data centre has been constructed using the very latest energy efficient technologies and employing a highly innovative holistic control system. In this article we will provide an update on the exciting results being achieved by the Boden Type Data Centre 1 (BTDC-1) and what we can expect from the project in the future.
Fig 1. Boden Type DC in Boden, Sweden
The project objective: To build and research the world’s most energy and cost-efficient data centre.
The BTDC is in Sweden, where there is an abundant supply of renewable, clean hydro-electricity and a cold climate ideal for free cooling. Made up of three separate research modules/pods of Open Compute/conventional IT, HPC and ASIC (Application Specific Integrated Circuit) equipment, the facility was set the EU’s target of a PUE of less than 1.1 across all of these technologies. With only half of the project complete, it has already demonstrated PUEs of below 1.02, which we believe is an incredible achievement.
Boden Type One fully populated design
The highly innovative modular building and cooling system was devised to be suitable for all sizes of data centre. By using these construction, cooling and operation techniques, smaller-scale operators will be able to achieve, or better, the cost and energy efficiencies of hyperscale data centres.
We all recognise that PUE has limitations as a metric, however in this article and for dissemination we will continue to use PUE as a comparative measure as it is still widely understood.
Exciting First Results - Utilising the most efficient cooling system possible
At BTDC-1, one of the main economic features is the use of EcoCooling’s direct ventilation systems with optional adiabatic (evaporative) cooling, which produces the cooling effect without requiring an expensive conventional refrigeration plant.
This brings two facets to the solution at BTDC-1. Firstly, on very hot days, or very cold, dry days, the ‘single box approach’ of the EcoCoolers can switch to adiabatic mode and provide as much cooling or humidification as necessary to keep the IT equipment’s environmental conditions within the ASHRAE ‘ideal’ envelope, 100% of the time.
With the cooling and humidification approach I’ve just outlined, we were able to produce very exciting results.
Instead of the commercial data centre norm of a PUE of 1.8 – that is, 80% extra energy used for cooling – we have been achieving a PUE of less than 1.05, lower than the published values of some data centre operators using ‘single-purpose’ servers – but we’ve done it with general-purpose OCP servers. We’ve also achieved the same PUE using high-density ASIC servers.
This is an amazing development in the cost and carbon footprint reduction of data centres. Let’s quickly look at the economics applied to a typical 100kW medium-sized data centre: the cooling energy cost drops from £80,000 to a mere £5,000. That’s a £75,000-per-year saving in an average 100kW medium-sized commercial data centre.
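The arithmetic behind those figures can be reconstructed in Python as follows. The electricity price is our assumption, chosen so that the PUE 1.8 case reproduces the quoted £80,000; everything above a PUE of 1.0 is treated as cooling (non-IT) overhead.

it_load_kw = 100
hours_per_year = 8760
price_per_kwh = 0.114   # assumed GBP/kWh

def annual_cooling_cost(pue: float) -> float:
    """Cost of the non-IT overhead implied by a given PUE."""
    overhead_kwh = (pue - 1.0) * it_load_kw * hours_per_year
    return overhead_kwh * price_per_kwh

print(f"PUE 1.8:  £{annual_cooling_cost(1.8):,.0f}")    # roughly £80,000
print(f"PUE 1.05: £{annual_cooling_cost(1.05):,.0f}")   # roughly £5,000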
Pretty amazing cost (and carbon) savings I’m sure you’d agree.
Smashing 1.05 PUE – Direct linking of server temperature to fan speed
What we did next has had truly phenomenal results using simple process controls, and what has been achieved here could be replicated in conventional servers. However, this ultra-efficient operation can only be achieved if the mainstream server manufacturers embrace these principles. I believe this presents a real ‘wake-up’ call to conventional server manufacturers - if they are ever to get serious about total cost of ownership and global data centre energy usage.
You may know that within every server, there are multiple temperature sensors which feed into algorithms to control the internal fans. Mainstream servers don’t yet make this temperature information available outside the server.
However, one of the three ‘pods’ within BTDC-1 is kitted out with about 140kW of Open Compute servers, and one of the strengths of the partners in this project is that average server temperature measurements have been made accessible to the cooling system. At EcoCooling, we have fed all of that temperature information into the cooling system’s process controllers (without needing any extra hardware). Normally, processing and cooling systems are separate, with inefficient time-lags and wasted energy; we have made them close-coupled and able to react to load changes in milliseconds rather than minutes.
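In spirit, the control loop is as simple as the Python sketch below: fan speed is driven directly by the average server temperature, with no separate, slow-reacting building loop in between. The setpoint, gain and limits are our illustrative assumptions, not the project’s actual tuning.

SETPOINT_C = 24.0   # assumed target average server inlet temperature
GAIN = 8.0          # assumed proportional gain (% fan speed per degree C)

def fan_speed_percent(avg_server_temp_c: float) -> float:
    """Proportional control: react to server temperature in a single step."""
    error = avg_server_temp_c - SETPOINT_C
    return max(20.0, min(100.0, 50.0 + GAIN * error))   # clamp to a safe range

for temp in (23.0, 24.0, 26.5):
    print(f"{temp:.1f} C -> fan at {fan_speed_percent(temp):.0f}%")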
As a result, we now have BTDC-1 “Pod 1” operating with a PUE of not 1.8, not 1.05, but 1.03!
Pod 1 – Boden Type DC One
The BTDC-1 project has demonstrated a robust repeatable strategy for reducing the energy cost of cooling a 100kW data centre from £80,000 to a tiny £3,000.
This represents a saving of £77,000 a year for a typical 100kW data centre. Now consider the cost and environmental implication of this on the hundreds of new data centres anticipated to be rolled out to support 5G and “edge” deployment.
Planning for the future - Automatically adjusting to changing loads
An integrated and dynamic approach to DC management is going to be essential as data centre energy-use patterns change.
What do I mean? Well, most current-generation data centres (and indeed the servers within them) present a fairly constant energy load. That is because the typical server’s energy use only reduces from 100% when it is flat-out to 75% when it’s doing nothing.
At BTDC-1, we are also designing for two upcoming changes which are going to massively alter the way data centres need to operate.
Firstly, the next generations of servers will use far less energy when not busy. So instead of 75% quiescent energy, we expect to see this fall to 25%. This means the cooling system must continue to deliver 1.003 pPUE at very low loads. (It does.)
Also, BTDC-1, Pod 1 isn’t just sitting idly drawing power – our colleagues from the project are using it to emulate a complete SMART CITY (including the massive processing load of driverless cars). The processing load varies wildly - with massive loads during the commuter traffic ‘rush hours’ in the weekday mornings and the afternoons. And then (comparatively) almost no activity in the middle of the night. So, we can expect many DCs (and particularly the new breed of ‘dark’ Edge DCs) to have wildly varying power and cooling load requirements.
Call to Intel, DEC, Dell, HP, Nvidia et al
At BTDC-1, we have three research pods. Pod 2 is empty - waiting for one or more of the mainstream server manufacturers to step up to the “global data centre efficiency” plate and get involved.
The opportunity for EcoCooling to work with RISE (the Swedish Institute of Computer Science) and German research institute Fraunhofer has allowed us to provide independent analysis and validation of what can be achieved using direct fresh-air cooling.
The initial results are incredibly promising and, considering we are only halfway through the project, we are excited to see what additional efficiencies can be achieved.
So come on Intel, DEC, Dell, HP, Nvidia and others: who’s brave enough to get involved?
By Dr Umaima Haider, Research Fellow, University of East London
EURECA was a three-year (March 2015 - February 2018) project funded by the European Commission’s Horizon 2020 Research and Innovation programme, with partners from the United Kingdom, Ireland, the Netherlands and Germany. It aimed to help address the data centre energy efficiency challenge in the European public sector by supporting public authorities in adopting a modern and innovative procurement approach.
EURECA reinforced the consolidation of newly built and retrofit data centre projects in various European countries, with a focus on Public Procurement for Innovation (PPI). Additionally, EURECA supported the development of European standards, best practices and policies related to energy efficiency in data centres and green public procurement, by providing scientific evidence and data.
For further information, please visit the EURECA project website: www.dceureca.eu
Key Results
The EURECA team designed various innovative engagement methodologies that used state-of-the-art models and tools. The project supported consolidation, new-build and retrofit data centre projects in member states. This resulted in savings of over 131 GWh/year of primary energy (that’s 52.5 GWh/year of end-use energy) from the immediate pilots supported within the project lifetime in Ireland, the Netherlands and the United Kingdom (plus various ongoing ones in other EU member states). This equated to savings of more than 27,830 tCO2/year, with annual electricity bill savings of €7.159M, and was achieved by working on pilots involving 337 data centres.
EURECA also influenced various policy-related initiatives. It contributed to several standards, including the EN50600 series on data centres, and EURECA team members continue to play an active role in developing the EU Code of Conduct for data centre energy efficiency.
Finally, EURECA trained over 815 stakeholders through 10 face-to-face training events, held across Europe. To know more about EURECA training, please read this article[1].
The European Commission’s feedback on the overall evaluation of the project stated that:
“The project has delivered exceptional results with significant immediate or potential impact.”
Data Centre Market Directory
As part of the EURECA project, a vendor-neutral, open market directory was established for the European data centre market. The directory currently lists over 250 data centre products and services available to the European market, and is hosted by ECLab[2], the EURECA coordinator.
So, if your business provides data centre-related products and/or services to the European market (irrespective of company size), you are welcome to list your offerings at DCdirectory.eu free of charge.
Policy Recommendations
Scientific research on hardware refresh rates was conducted under the EURECA project[3] and was referenced by the Amsterdam Economic Board in the Netherlands in June 2018. The Board’s report provided policy guidance in the field based on the findings of this work[4].
In September 2018, European member states voted to implement regulation (EU) No 617/2013 under Directive 2009/125/EC, focusing on the Ecodesign requirements for servers and online data storage products. Computer Weekly interviewed the EURECA coordinator, a key player in supporting the legislation, who shared some of the research findings and supporting evidence provided to the policymakers[5].
EURECA in the news
In February 2018, Computer Weekly magazine published an interview with the project coordinator on the energy consumption of public sector data centres. The article discussed the EURECA project and revealed for the first time some of the project’s findings, such as the size and running cost of European public sector data centres[6].
In October 2018, the BBC Reality Check team interviewed Dr Rabih Bashroush about the energy consumption of streaming media and the overall trends in data centre energy consumption. Based on this, they published an article titled: “Climate change: Is your Netflix habit bad for the environment?”[7]
Next steps
The rich body of knowledge produced by the EURECA project, along with the impact already created, ensures a lasting legacy well beyond the project lifetime, and the EURECA team plans to continue building on this work.
[1] Umaima Haider, “Building the Capacities and Skills of Stakeholders in the Data Centre Industry”, DataCentre Solutions (DCS) Europe Magazine, July 2018. https://tinyurl.com/y83n29ex
[2] Enterprise Computing Research Group (ECLab), University of East London, UK. https://www.eclab.uel.ac.uk
[3] Rabih Bashroush, “A Comprehensive Reasoning Framework for Hardware Refresh in Data Centres”, IEEE Transactions on Sustainable Computing, 2018. https://doi.org/10.1109/TSUSC.2018.2795465
[4] “Circulaire Dataservers” (in Dutch), Amsterdam Economic Board, Netherlands, June 2018. https://www.amsterdameconomicboard.com/app/uploads/2018/06/Circulaire-Dataservers-Rapport-2018.pdf
[5] “EU-backed bid to cap idle energy use by datacentre servers moves closer”, Computer Weekly, 19 September 2018. https://www.computerweekly.com/news/252448914/EU-backed-bid-to-cap-idle-energy-use-by-datacentre-servers-moves-closer
[6] “The EURECA moment: Counting the cost of running the UK’s public sector datacentres”, Computer Weekly, 20 February 2018. http://www.computerweekly.com/feature/The-EURECA-moment-Counting-the-cost-of-running-the-UKs-public-sector-datacentres
[7] “Climate change: Is your Netflix habit bad for the environment?”, BBC, 12 October 2018.
By Wendy Torell, Senior Research Analyst, Schneider Electric Data Center Science Center
Today’s data centres are responsible for delivering many of the services on which the digital economy has become dependent. They have become an interconnected network which can be stratified into three distinct layers: larger centralised and hyperscale facilities; regional data centres serving local needs; and, at the edge of the network, the smallest, most highly integrated and autonomous solutions, providing low-latency connectivity and specialised applications closer to the point of use.
One of the main challenges for many organisations is that edge data centres must offer similar levels of reliability, availability and security to the largest hyperscale facilities occupied by the internet giants. However, they must do so without the comfort of permanent on-site technical personnel, and they must be cost-effective to design, build and maintain, whilst remaining quick to deploy in response to growing business demands.
The issue is further compounded because many of today’s businesses, retail providers for example, may have to maintain a large volume of distributed edge facilities across multiple locations. Standardisation, together with Cloud-based, mobile-friendly data centre infrastructure management (DCIM) software, such as Schneider Electric’s EcoStruxure IT, becomes crucial, as it is neither feasible nor cost-effective to have permanent IT staff available at every site.
This presents a conundrum for many of today’s edge advocates, one that requires a collaborative response from specialists within the critical infrastructure and IT industries.
The characteristics of the ecosystem
The requirement at the edge demands an integrated ecosystem, including products from various vendors, OEMs and IT providers, which can be quickly assembled into customised solutions to meet customers’ specific needs. This, in turn, means that an ecosystem of partners working together is essential to ensure interoperability, faster deployments with fewer errors, and more efficient and effective operation of edge computing infrastructure. It is this group of partners and vendors who compensate for the lack of on-site staffing, and who must do so across multiple, geographically dispersed sites.
Such an ‘edge ecosystem’ comprises four distinct elements clustered around the end-user, who defines the key business requirements the solution will fulfil. It might be a retail-automation application for a branch store with very tight space restrictions; or a video-analytics application requiring integration with specialist non-IT equipment. Whatever the need, it will affect the demands imposed on the other members of the ecosystem.
First and foremost are the IT vendors themselves, who supply the servers, storage, networking and software necessary to support the customer’s business objectives. It is rare for a single vendor to be able to offer a “one-stop shop” for all components, so interoperability between different technologies, preferably based on industry standards, is essential. Despite the trend towards hyper-convergence, with all products integrated into one device, the nature of the industry is that continuous innovation will bring new capabilities embedded within new technologies, which must work with existing equipment to be effective.
Secondly, there are the physical infrastructure vendors, who provide the critical IT, power and cooling equipment which ensures the solution remains secure and operational. This category includes vendors of racks, uninterruptible power supplies (UPS), power distribution units (PDUs), cooling systems and environmental monitoring equipment. They will also provide physical security systems to protect against intrusion by unauthorised personnel.
To drive speed of deployment with customised solutions tailored to a specific application, infrastructure vendors must also deliver tools that allow their products to be pre-assembled and pre-configured to a high standard with the minimum of effort. Such tools include reference designs: blueprints of physical infrastructure solutions designed and built to a number of specifications to meet various customer needs. Some vendors also offer rules-based configurator tools that make it much easier and faster for a systems integrator or the end user to build a specific infrastructure solution for a given IT stack, as the sketch below illustrates.
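To make the idea of a rules-based configurator concrete, here is a minimal sketch of the kind of validation such a tool performs. The rack, UPS and cooling rules are simplified, hypothetical examples, not any vendor’s actual rule set.

```python
# Minimal sketch of a rules-based configurator check for an edge IT stack.
# The rules (rack space, UPS headroom, cooling capacity) are simplified,
# hypothetical examples of what a vendor configurator might enforce.

from dataclasses import dataclass

@dataclass
class Component:
    name: str
    rack_units: int
    power_watts: int

def validate(stack, rack_units=24, ups_watts=5000, cooling_watts=6000):
    """Return a list of rule violations for a proposed edge stack."""
    errors = []
    used_u = sum(c.rack_units for c in stack)
    load_w = sum(c.power_watts for c in stack)
    if used_u > rack_units:
        errors.append(f"Rack overfull: {used_u}U exceeds {rack_units}U")
    if load_w > ups_watts * 0.8:  # rule: keep 20% UPS headroom
        errors.append(f"UPS headroom rule broken by {load_w}W load")
    if load_w > cooling_watts:    # rule: cooling must cover the heat load
        errors.append(f"Cooling undersized for {load_w}W heat load")
    return errors

stack = [Component("server", 2, 800) for _ in range(4)]
stack.append(Component("switch", 1, 150))
print(validate(stack) or "Configuration passes all rules")
```

Encoding such rules in software is what allows a solution to be pre-assembled and pre-configured to a high standard with minimal manual engineering effort.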
Online software applications, such as Schneider Electric’s data centre trade-off tools, allow customers to model the cost and integration implications of choosing one type of product over another, whilst simplifying the design and specification process. Finally, management tools with public application programming interfaces (APIs) allow vendors’ products to be integrated into third-party management software for remote monitoring and maintenance.
The third element of the ecosystem comprises systems integrators: companies that bring together IT and infrastructure products into a bespoke solution for a specific customer. Such organisations have specialist integration knowledge and expertise covering all aspects of the data centre, as well as close relationships with many of the key vendors. They can determine the optimal design of a data centre based on their knowledge of the market and their insight into available technologies.
Cloud-based, mobile-friendly software becomes crucial
The final element comprises the on-going management and successful operation of a data centre once it is up and running. Increasingly this falls within the remit of Managed Service Providers (MSPs), who provide monitoring and maintenance services according to a strict service level agreement. Edge data centres are usually small installations, often in remote locations where it is neither practical nor cost-effective to have permanent technical staff on-site, which makes the need for specialist service providers to manage on-going operations all the more acute.
Key to enabling success for Managed Service Providers is Cloud-based data centre infrastructure management (DCIM) software, such as Schneider Electric’s EcoStruxure IT. Here, sensors and firmware installed on the individual network-connected products provide status information to a gateway application, which then passes the information to a server in the cloud. Through a combination of cloud-based software, mobile devices, apps and analytics, this data can be aggregated into an overall management solution and made available to trusted, specialist service personnel anywhere and on any device. In this way, scheduled maintenance operations such as upgrades and component replacements can be managed remotely and planned proactively, whilst enabling a rapid response in the case of an unplanned event or outage.
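As a rough illustration of the data flow just described (sensor to gateway to cloud), the sketch below shows a hypothetical gateway aggregating device readings and posting them to a cloud endpoint. The endpoint URL, payload schema and reading function are invented for illustration and do not represent the actual EcoStruxure IT API.

```python
# Hypothetical gateway loop: poll local sensor readings and forward them to
# a cloud monitoring service. The endpoint and payload schema are invented
# for illustration; they are not Schneider Electric's actual API.

import json
import time
import urllib.request

CLOUD_ENDPOINT = "https://example.com/api/telemetry"  # placeholder URL

def read_sensors():
    """Stand-in for firmware/SNMP/Modbus reads from rack equipment."""
    return {"ups_load_pct": 42.0, "inlet_temp_c": 23.5, "pdu_amps": 6.1}

def push(reading):
    """POST one telemetry sample to the cloud service."""
    body = json.dumps({"site": "branch-017", "ts": time.time(),
                       "metrics": reading}).encode()
    req = urllib.request.Request(CLOUD_ENDPOINT, data=body,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return resp.status

for _ in range(3):       # a real gateway would loop indefinitely
    push(read_sensors())
    time.sleep(60)       # e.g. one sample per minute per site
```

Aggregated across hundreds of sites, this kind of feed is what allows an MSP to monitor equipment, plan maintenance and respond to outages from a single console.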
Conclusion
Naturally, the four elements of the ecosystem surrounding the customer may not need to be provided by four separate entities. Many large organisations will have specialist service personnel within their IT division who can provide remote managed services to branch organisations within the company. Conversely, many IT vendors and infrastructure providers may have their own systems-integration capabilities, and in turn many specialist systems integrators are also able to provide on-going managed services as part of their portfolio.
Nevertheless, successful implementation of a hybrid IT architecture comprising central, regional and edge data centres demands an appreciation of the distinct elements in the ecosystem and a determination to ensure that they can and will interact as seamlessly and as efficiently as possible.
Within the ecosystem, collaboration between critical infrastructure vendors, IT providers, original equipment manufacturers (OEMs), MSPs and systems integrators is essential to providing the end-user with the right solution. This includes the key products and services needed throughout the edge lifecycle, from configuration, assembly and delivery through to operations and maintenance.
Far from being an appendage to an organisation’s IT operation, edge computing solutions are typically located at the very point of interaction with customers. As such, their performance, reliability and availability are every bit as vital as the most carefully maintained and generously manned hyperscale or centralised data centre at the heart of an organisation. Paying attention to all the elements in the ecosystem will help to ensure both their effective operation and 24/7 availability for today’s digital businesses.
To support IT professionals in developing a strategy to deploy IT at the edge, Schneider Electric has released a new White Paper entitled “Solving Edge Computing Infrastructure Challenges”, which is immediately available for free download.